The Differences Between Working in Game Audio and Film: Part 1

Game Audio / Interactive Music / Sound Design

Have you been working in the film industry and wondered what it would be like to work in game audio? Or perhaps vice versa? I’ve been working as a composer and sound designer for both games and films for the last 10 years. On a macro level, we are basically delivering the same thing: sounds and music. On a micro scale, the two mediums can be vastly different in terms of workflow, approach and environment. In this two-part blog, I share some thoughts on the similarities and differences, and to make sure that I'm not just making things up, I got in touch with four renowned artists who shared some of their own thoughts and experiences. What you'll find here are perspectives, of course, but they should give you a better understanding of how working in one industry compares to the other. To clarify, this blog is mainly for newcomers who are starting out in their audio career, or for those who have only worked in one of the industries and wonder what the other might offer. Let’s dive into it!


THE WORK AND THE DELIVERY

Let’s first talk about the work: what we actually do, and what we have to deliver. A recurring theme is that with games you deliver flexible sounds/music that can adapt to the gameplay, whereas in film it’s pretty much one fixed delivery that will not change once it’s handed over. In film, we try to finely shape every little part of the sound/music to emphasize the narrative. In games, we need to leave room for the narrative to be stretched if the player decides to explore (all relative to the game and its mechanics, obviously).

But how do these differences affect how we work? 

 

Sound design:

Let’s say you have to create the sound of a door opening and closing. In film, you would create that one sound for the one time you see the door open and close. In games, you won’t know whether the player will open and close that door 20 times, so you’ll need to provide variations; a door opening and closing in real life would not sound exactly the same each time, and creating variations helps keep the player's illusion intact. Since games run in software, we can often still manage with a few sound effects and then add variance at runtime through slight pitch and filter changes.
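To make that last point a bit more concrete, here is a minimal sketch in plain C++ (no particular engine or middleware assumed; the file names are purely hypothetical) of the idea: keep a small pool of recorded variations, and each time the door is used, pick one at random and apply a slight random pitch offset.

```cpp
// Minimal sketch: a handful of door recordings plus per-play randomization,
// instead of authoring a unique sound for every possible door interaction.
#include <cmath>
#include <cstdio>
#include <random>
#include <string>
#include <vector>

int main()
{
    // Hypothetical asset names; in a real project these would be audio files
    // or containers inside the game audio middleware.
    std::vector<std::string> doorVariations = {
        "door_open_01.wav", "door_open_02.wav", "door_open_03.wav", "door_open_04.wav"
    };

    std::mt19937 rng(std::random_device{}());
    std::uniform_int_distribution<size_t> pick(0, doorVariations.size() - 1);
    std::uniform_real_distribution<double> cents(-200.0, 200.0); // +/- 2 semitones

    // Simulate the player opening the door a few times.
    for (int i = 0; i < 5; ++i)
    {
        const std::string& clip = doorVariations[pick(rng)];
        double pitchCents = cents(rng);
        double playbackRate = std::pow(2.0, pitchCents / 1200.0); // cents -> rate multiplier

        std::printf("Play %s at %.2fx speed (%+.0f cents)\n",
                    clip.c_str(), playbackRate, pitchCents);
    }
    return 0;
}
```

In practice, this kind of randomization is usually authored directly in the middleware (for example, a random container with pitch and volume randomization) rather than written in game code, so the sound designer keeps full control over it.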

When delivering sound for a film, the standard is to deliver a Pro Tools session for the length of the film, which then goes to mixing. For games, there’s the big additional task of implementation, where the sounds are integrated into the game engine so that they get triggered correctly throughout the game. Game audio sound designers may or may not be the ones integrating the sounds, and may create variations directly in game audio middleware.

 

Richard Gould on what the work entails:

“One of the big things in film one has to consider is the production track. The production track is made up of whatever sound was recorded on set, typically consisting of dialogue but also sometimes sounds like footsteps, doors closing, things like that. In post production, we often include these sounds from the production track in the final mix so any sounds that we add, whether they’re foley or a sound effect, have to match the quality and feel of the production track. The same goes for dialog in that live action films often have a mix of production dialog and ADR (Automated Dialog Replacement) which also have to match. Sometimes, due to technical or practical issues, the production sound that was recorded isn’t optimal and you can find yourself fighting with the production sound trying desperately to make it sound better. This isn’t an issue in animation films (where there’s no production track) or with video games, as all of the dialog is recorded in the studio.”

Jamey Scott on his workflow and delivery:

“It’s a completely different dynamic. When I work on a film, it’s all about a linear production and putting the sounds in a format in Pro Tools so they can go down the assembly line into the mix. When I work on a game, I’m not working in a linear structure in Pro Tools. I’ll add a marker, for instance labeling a type of gun, and then I'll go in and make all of the variations for firing, recoil, etc. I'll always do multiples, and I'll consolidate them into little files that then go into the game engine or audio middleware. So the delivery is very different, because you deliver lots of individual files for games as opposed to one big single file in the different formats (i.e. Dolby Atmos, 5.1, stereo) for films.”

 

Music:

In film, the music is mainly written for a scene (scored to picture) and will have a specific length. You often fine-tune every detail to specifically fit each moment. A main consideration in film is to shape the music around potential dialog and sound design. The delivery will be one long file per cue, potentially in layers/stems (drums, brass, synth, etc.) to give some flexibility in the final mix/dub. The number of layers varies from project to project, and for smaller films with lower budgets a single stereo mix will sometimes do.

In a game, most music has to be shaped around player interaction. Since it is unknown how long a player will stay in the same spot, the music needs to be interactive and flexible enough to adapt to the pace of the player. There are different ways to create interactive music. To name a few: looping stems, creating several layers/versions with different intensities that can be cross-faded, or creating shorter music sections for variation between stems to make the music less repetitive. Cut scenes in games are basically small films within the game, and this is where the two types of work are the most similar, as the composer will just deliver one long file that fits the cut scene without any player interaction.
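As a rough illustration of the layered-intensity approach above, here is a minimal sketch (the layer names and thresholds are hypothetical) of how a single gameplay-driven intensity value could be mapped to crossfade gains for a few stems that loop in sync:

```cpp
// Minimal sketch: map one "intensity" parameter (0..1) to per-stem gains,
// so layers fade in and out as the gameplay intensifies or calms down.
#include <cstdio>
#include <vector>

struct MusicLayer {
    const char* name;
    double fadeInStart; // intensity at which the layer starts fading in
    double fullOn;      // intensity at which the layer reaches full volume
};

// Linear crossfade gain for one layer at the current intensity.
double layerGain(const MusicLayer& layer, double intensity)
{
    if (intensity <= layer.fadeInStart) return 0.0;
    if (intensity >= layer.fullOn) return 1.0;
    return (intensity - layer.fadeInStart) / (layer.fullOn - layer.fadeInStart);
}

int main()
{
    // Hypothetical stems, all the same length and looping in sync.
    std::vector<MusicLayer> layers = {
        {"pads",       0.0, 0.1},  // present as soon as the music starts
        {"percussion", 0.3, 0.5},  // enters as tension rises
        {"brass",      0.6, 0.9}   // only at full intensity
    };

    for (double intensity : {0.0, 0.25, 0.5, 0.75, 1.0})
    {
        std::printf("intensity %.2f:", intensity);
        for (const auto& layer : layers)
            std::printf("  %s=%.2f", layer.name, layerGain(layer, intensity));
        std::printf("\n");
    }
    return 0;
}
```

Middleware such as Wwise offers ready-made tools for this sort of behavior, so the mapping is typically authored there rather than hand-coded, but the underlying idea is the same.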

Garry Schyman on how he delivers things:

“In video games it seems a little counterintuitive, because games are using such complex technologies, but in video games I'm still mostly asked to deliver stereo wav files. Obviously you are separating out layers, intros and outros and those sorts of things that are part of the game's interactive dynamics. Sometimes I’m asked to deliver stems, especially with Sony. I’ve only been asked twice to deliver 5.1 mixes, whereas that's more the norm in film: even 7.1 mixes, and everything in 8-12 stems so they can re-mix it on the dub stage.”

Brian Tyler on how he approaches the work:

“In the beginning stages, technically, that’s where they meet a little bit more. When composing for film, I don’t start writing for each scene at the beginning. I will first watch the film to get a feel for it, and in a game I will play a couple of levels to get an overview of the story and cut scenes, etc. Then I’ll step away and start writing music and themes, trying to capture some of the emotions that I got from watching the film or playing the game. But from there on, they start to diverge a bit.

In a game, there are often these distinct areas and worlds that need specific music to represent them. It’s almost like 5-7 small movies where each has its own complete arc, whereas in a movie there’s kind of only one big arc. And in a game, because the music changes depending on what the player does, it must be written in a modular way so it stays cohesive. When it comes to movies, I’m in complete control of how the music works - it will sound the same way every single time, so I can easily control how it will flow.”

When sound and music are delivered for a film, they are usually all brought together in one Pro Tools session and the filmmakers create a final mix. On bigger productions, this is done on a dubbing stage, a room the size of a cinema, to give the right impression of how the film will play in a movie theater. On smaller productions, the final dub might just be done in a studio. The final mix is done either by the sound designers or by a dedicated re-recording mixer, also called a dubbing mixer.

For games, when music and sound are delivered, they need to be implemented in the game. This is either done directly in the game engine (i.e. Unity, Unreal, etc.) or by using game audio middleware such as Wwise, where the sounds are then triggered from the game engine. This implementation can be done by the sound designer or composer, or by the coders on the audio production team. The advantage of using game audio middleware is that it usually contains tools for panning, volume, effects and sound variations; using those tools simplifies the mixing process without requiring extra coding resources from the game development team. Mixing a game is definitely as important as mixing a film, but it’s often done throughout the game's production and during play-testing, rather than only as a final step in the process as is the case for most films. Technical implementation, including creating sound variations outside the DAW and building audio systems with middleware, is becoming more and more a part of the sound designer's or composer's job in game audio, as opposed to simply being a good skill to have.
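To show what “triggered from the game engine” can look like in code, here is a minimal sketch using two calls from the Wwise SDK, AK::SoundEngine::PostEvent and AK::SoundEngine::SetRTPCValue. It assumes the sound engine is already initialized, the relevant SoundBanks are loaded, and the game object is registered elsewhere; the Event and Game Parameter names are hypothetical and would come from whatever the sound designer authored in the Wwise project.

```cpp
// Minimal sketch: gameplay code only posts Events and updates Game Parameters;
// which sounds actually play (variations, mixing, transitions) is authored in Wwise.
#include <AK/SoundEngine/Common/AkSoundEngine.h>

static const AkGameObjectID kDoorObjectID = 100; // arbitrary ID, registered elsewhere

void OnDoorOpened()
{
    // Hypothetical Event name; the door sound variations live inside the Wwise project.
    AK::SoundEngine::PostEvent("Play_Door_Open", kDoorObjectID);
}

void OnCombatIntensityChanged(float intensity01)
{
    // Hypothetical Game Parameter (RTPC) that could drive, for example,
    // the crossfaded music layers described earlier.
    AK::SoundEngine::SetRTPCValue("Combat_Intensity", intensity01);
}
```

The point of this pattern is the division of labor: the programmer exposes gameplay state, and the sound designer decides inside the middleware how that state translates into sound.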

 

CREATIVE FREEDOM

This point is definitely very subjective, but still an interesting one to discuss. I’ve heard many different opinions about this, but they often lean towards the idea that working in games provides sound designers and composers with a bit more creative freedom.

 

Sound design:

For sound in games it definitely depends on the size of the game, but from my own experience and what others have shared with me, games tend to give a bit more freedom. Both Richard Gould and Jamey Scott provide some good insights:

 

Richard Gould on creative freedom:

"I think the scale of a project, how large or small and whether it's being developed at a studio or if it's an indie development, affects the creative freedom. As far as trying to compare the two mediums;
I think there's more design opportunities in video games because you're often creating worlds that don't exist in reality. This isn't necessarily as true for film where you’re often creating a reality based on the world in which we live in. Now obviously there are exceptions to that but generally I find this to be true. Another reason why I think games sometimes offer you a little more creativity is the fact that you often have more time to develop ideas. (see next section)
Whereas in film you're often having to move quite quickly. It's not to say there isn't time to experiment [in film] but I think that there’s more time for experimentation in video games.”

 

Jamey Scott on creative freedom: 

“I feel a little less literal when I work on games. In films (unless it's animated), if you get too far away from the reality of how a sound plays, I feel like people check out, whereas in games it's all animated and I can do things that don't quite fit with the action. So maybe games are a little bit more creatively stimulating.

There's sort of a linear scale of games. There are the triple A games where there might be an audio director who really knows their stuff and would not tolerate any sort of deviation from their vision. Then there are smaller games where they don't know what they want apart from it sounding great. In films you're dealing with 100 years of tradition, and filmmakers are generally as in tune with sound as they are with music. They know what they want and they know how to articulate it. So I feel with films I have to be a lot more careful with what I present.”

 

Music:

In music, too, many will argue that since you don’t have to fit specific timings and hit points in the picture, you have more creative freedom. Others will say that there’s a limitation in game music since it has to stay flexible (as described in the previous section), which sometimes forces you to limit musical content such as modulation (key changes) or tempo shifts in order to make it loop-able or crossfade-able and easier to adapt to the gameplay.

 

Garry Schyman on creative freedom:

"I definitely feel there’s more freedom in games because your compositional decisions are inspired by what's going on but aren't necessarily locked to any specific image or moment or action by a character. Cut-scenes in games are scored like a film. So those are identical. So you have more compositional freedom [in games] but also more responsibility to the composer, at least that's how I view it."

 

Brian Tyler on creative freedom:

“I feel you have a good amount of freedom in a game because you’re doing things that aren’t specific to every single line of dialog, so you can step back a bit more and listen to the music as a whole. In movies, you can micro analyze things to the point of absurdity, trying to figure out if you should have music playing on that piece of dialog or not. Most moviegoers aren’t analyzing the music when watching a film. The score is manipulating emotions in the background. On the whole, film can be more micro analytical, but I’ve also worked on a game where they micromanaged the music to the point where there was no longer any real benefit."

Ultimately, I believe it comes down to what type of composer you are and what type of music you write. It’s important to understand the different “limitations” each medium demands and to keep that in mind as a creative starting point.

 

TIME

Artists today are required to deliver more content in shorter time periods, and for less money. That seems to be the tendency across all media these days. TV series, for instance, always demand much shorter turnarounds than film. But how do the games and film worlds compare?

 

Jamey Scott on time:

“I feel right now in games there's maybe more of a time luxury than there is in films. Maybe that's because I've gone from working on big films to working on TV shows, just because that's where the work is right now. Often with films (and TV) you’re asked to deliver hard effects for the entire 110 minutes in 2 weeks, so you’re furiously scrambling to create that much content. With games, I'll think about the time it's going to take to record and create all the assets, but I'm also tacking on double that amount of time for implementation and technical back and forth.”

 

Richard Gould on time:

“Film is such a well-established medium. Whilst there are certainly still technological developments and new delivery formats, Dolby Atmos being a relatively recent example, generally the workflow is pretty set in place, both at a macro scale in terms of the processes of editorial, mixing and mastering, but also at an individual level in terms of people’s workflows and toolsets. So the machine (in film), if you want to call it that, is well-oiled and capable of moving quickly, which is why I typically spend much less time on a film project than a game project. I spend at most three to four months on a film, whereas video games can span over a year in terms of the production schedule.”

 

Garry Schyman on time:

“In my experience, with games you have much more time. That's probably because games tend to have a longer development process than films. They [game developers] tend to bring you in earlier in the process, either at the start or in the middle. That also occasionally happens in films, where you hear about some composer being hired a year before the movie is finalized. But I would say more often than not you’re hired on a film where there's an edited picture to score and limited time to finish.”

 

Brian Tyler on time:

“On average, I’m on a game longer. There’s a game that I started 2 years ago that’s not coming out until next year. Granted, I tend to get tied in to pretty epic games, but there are also times I’m on a movie and will start to speak with the director maybe a year or so before the premiere.”

 

Stay tuned for Part 2 of this blog where we will cover some differences when it comes to building a career within each industry, a little on the social aspects, and how deals are made including how we get paid as sound designers and composers. 

 __________________________

Interviewees:
Jamey Scott has worked on films like Total Recall, Dark Tower, Slender Man and on games like Gears of War, Mafia III and Call of Duty.
Richard Gould has worked on films like Black Panther, The Post, Despicable Me 3 and on games like Arden's Wake, WWE Tap Mania and DreamWorks Press: Dragons.
Garry Schyman has worked on films like Itsy Bitsy, Brush with Danger, Spooky House and games like The Bioshock Trilogy, Middle Earth: Shadow of War and Dante’s Inferno.
Brian Tyler has worked on films like Iron Man 3, Avengers: Age of Ultron, Fast & Furious and games like Far Cry 3, Assassin’s Creed IV: Black Flag and F1 2018.
 

Jesper Ankarfeldt

Composer and Sound Designer


Jesper is a Danish film composer and multi-instrumentalist based in Los Angeles. He began learning music at age 6 and now holds a Master's in Film Music from the Conservatorium van Amsterdam and a Master of Music from USC Thornton's prestigious Screen Scoring department. Jesper has been making games since 2011, working on music, sound design, integration and game design. He has a passion for technology and loves sharing knowledge and ideas. http://www.ankarfeldt.dk/

Comments

Nimra Khan

June 19, 2019 at 07:23 pm

Wow, I'm a fan. Thank you for sharing this valuable information!

Eric Williamson

June 30, 2019 at 08:15 pm

Great article! For a long-time “traditional” television and film post-production engineer looking to expand into the gaming world, it was refreshing to hear from experienced actors in both worlds, and their opinions on the differences between them in terms of music and audio.

