1. First, tell us about your background and how you got your foot in the door of the video game composition and sound design industry.
I started composing music at 8 years old and I knew from the start that I wanted to be a composer. I spent my afternoons at the piano, writing dissonant chords and wonky melodies on paper (and I must say that very little has changed since then).
I enrolled in the State Conservatory studying Organ and Organ Composition and got my first piece, a cello quartet, performed live at 14. I continued to compose music throughout all my studies, but after high school I was really unsure about what to do.
That’s when I decided to enroll in Electronics Engineering and spend years scratching my head over programming, signal theory and digital systems. After graduating I enrolled in another University with a Computer Science degree focused on Sound and Music Technology.
There I started to experiment with sound design, audio programming and got really interested in video games. I always played games but I had never considered a career developing them before, so I thought “Perfect! If I can’t decide between art and tech, I’ll do both!”.
Basically I holed up in an apartment for half a year with a classmate and we started developing our own game. During this time I happened to meet the lead of a small new team with several veterans of the game industry.
They were looking for a composer and sound designer, so I sent them a couple of demos and got my first job. In the end the project got shut down because of funding issues, but months later one of the former members of the team introduced me to 34BigThings, who were also looking for a composer and sound designer. I was in the middle of my finals, but I jumped on the opportunity and transferred there to work with them.
2. How do you actually do sound design for video games? What is your workflow?
The game industry is different from any other entertainment industry, since it merges the creative process with software development. Very often the production pipeline is based on iterative processes with a flexible approach derived from the Agile framework.
We usually start by discussing game specifics with the game designers and programmers. We review the Game Design Document and identify the key parts of the game where audio can help the gameplay and narrative design convey the right messages and give feedback to the players.
This is the pre-production phase, where we test aesthetics, techniques and system structures; it can last from a few days to months, depending on the size of the project.
By the end of pre-production we have prepared an Audio Design Document and an audio asset list, and we start designing the audio systems in terms of structure, events and code, using placeholder assets.
The audio assets are reworked over and over, tested in game at every iteration, until we have the final version. We also tweak and optimize the audio system to make it work seamlessly and with good performance (and this often means lots of bug testing from QA and sleepless nights).
Sometimes there are major changes to the gameplay or the technology used in the middle of production, and this adds another degree of difficulty to the job!
3. What tools do you use to do sound design (microphones, hardware, software…)?
My main DAWs are Cubase and Reaper. I use Cubase to compose music but when I have to design sounds I find Reaper to be more flexible for the job.
I usually use wavetable synths like Massive from Native Instruments and Serum from Xfer Records to design things like sci-fi weapons, spaceships and UI. I have lots of plugins, but my favourites are the ones from Soundtoys; I always manage to obtain some interesting textures or resonances with them.
We use sound libraries for Foley, some recorded by us over the years and some commercial, but when we need to record something we have an external studio.
I also have a small Tascam DR-40 that comes in handy when I’m traveling. I use Wwise, FMOD and Fabric as middleware, depending on the project’s size and specifics.
At 34BigThings we develop games in Unreal Engine 4 and Unity, and some of the projects were done using their built-in audio systems, without relying on middleware; so if the job requires it, I may also have to script the audio methods in C# or Blueprints.
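To give a flavour of the kind of audio scripting mentioned above, here is a minimal, hypothetical Unity C# sketch (the class and field names are my own invention, not code from any 34BigThings project): a component that plays a random footstep variation with slight pitch jitter, a common trick to keep repeated Foley from sounding identical.

```csharp
using UnityEngine;

// Hypothetical example: plays a random footstep clip with slight
// pitch variation so that repeated steps never sound the same.
[RequireComponent(typeof(AudioSource))]
public class FootstepPlayer : MonoBehaviour
{
    public AudioClip[] footstepClips;  // variations assigned in the Inspector
    public float pitchJitter = 0.1f;   // +/- range around normal pitch

    private AudioSource source;

    void Awake()
    {
        source = GetComponent<AudioSource>();
    }

    // Typically called from an animation event or a character controller.
    public void PlayFootstep()
    {
        if (footstepClips == null || footstepClips.Length == 0) return;

        // Random.Range with int arguments excludes the upper bound.
        AudioClip clip = footstepClips[Random.Range(0, footstepClips.Length)];
        source.pitch = 1f + Random.Range(-pitchJitter, pitchJitter);
        source.PlayOneShot(clip);
    }
}
```

In a middleware-based project the same randomization would usually live inside a Wwise or FMOD event instead, leaving the engine-side code as a single event trigger.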
4. For aspiring sound designers who want to break into the video game scene, can you give some advice?
Be curious and stay hungry. The industry is constantly evolving in terms of technology and creativity, so it can be overwhelming at first, but don’t give up!
Learn the basics of acoustics, psychoacoustics, sound recording and sound synthesis, and then study the tools for game development: middleware, game engines and, last but not least, software development frameworks.
There are many good courses out there, but nowadays you can find plenty of information with a Google search, so it’s definitely easier than it used to be. Also, go to conventions and talk to developers and other sound designers; it helped me a lot when I was just starting out.
5. What are your goals for your career in music?
I achieved one of my long-time dreams last year: to have my soundtracks performed live in concert by an orchestra. It was an amazing experience and I cannot wait to repeat it this year!
My main goal has always been the same: to work on interesting projects and transmit emotions to the players through music. I think that’s the only thing that really matters for a composer.
6. Anything else you want to add, words of wisdom and motivation perhaps?
I often have students come to me asking about the difference between music and sound. The question may seem trivial, but the truth is that there isn’t a real distinction between them, and the fact that today a film or game score can be made of soundscapes, drone textures and sound design elements proves it.
John Cage taught us that “there is no noise, only sound”. We live in a time where everyone can freely experiment and push the boundaries between elements until they break. Be a boundary-breaker!