Table of Contents
There comes a point in game audio integration where the focus shifts to the subtler question of how the various sounds you’ve integrated into the game work with each other. It’s important that each sound contributes without becoming a distraction. This is where the art of mixing comes into play.
Compare this process with how sound is mixed in movie production. In audio for movies, all of the audio is collected together and played through a large mixing console that gives the engineer quick access to controls for modifying hundreds of channels of sound until they blend seamlessly, creating a uniform soundscape that complements the picture. You can recreate a similar workflow in Wwise by creating custom virtual mixing consoles, where object properties are displayed as mixer strips, much like the channel strips on an audio console or in the mixer view of a digital audio workstation. Furthermore, you can assign these properties to the physical knobs and faders of nearly any MIDI-based control surface, providing a tactile connection to the sound you’re adjusting. This way you can manipulate multiple properties simultaneously and speed up your workflow.
The major departure from a film-based workflow is timing: in film, the engineer can simply click play on the project and every sound plays exactly when it occurs on the timeline. With video games there is no predetermined timeline; sounds play whenever various events happen in the game. This makes testing how sounds relate to each other a bit more difficult.
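The timeline-versus-event distinction can be illustrated with a minimal sketch. This is not Wwise code; the event names, sound names, and functions below are hypothetical, chosen only to show why a game mix cannot simply be "played back" from start to finish the way a film mix can:

```python
import random

# Film-style mix: every sound's start time is fixed on a timeline,
# so playback order is fully predetermined.
timeline = [(0.0, "music_intro"), (2.5, "dialogue_01"), (4.0, "door_close")]

def play_film_mix(timeline):
    """Return sounds in the order the timeline dictates."""
    return [name for _, name in sorted(timeline)]

# Game-style mix: sounds are bound to events, not to times.
# (Hypothetical event and sound names for illustration.)
event_to_sound = {
    "player_jump": "jump_grunt",
    "door_opened": "door_creak",
    "enemy_spotted": "alert_sting",
}

def on_game_event(event):
    """A sound plays only when its triggering event occurs at runtime."""
    return event_to_sound.get(event)

# Each play session is an unpredictable event stream, so the combination
# of simultaneous sounds differs every run -- which is what makes
# auditioning the mix harder than pressing play on a film project.
session = random.sample(list(event_to_sound), k=2)
sounds_heard = [on_game_event(e) for e in session]
```

The sketch only makes the structural point: a film mix is a pure function of time, while a game mix is a function of whatever events the player happens to cause, so any two test sessions can produce different sound combinations.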