The Integration Demo application contains a series of demonstrations that show how to integrate various features of the sound engine in your game.
|
Note: All code presented in this section is available in a sample project in the "samples\IntegrationDemo\$(Platform)" directory. |
The Integration Demo binaries are available in the "$(Platform)\[Debug|Profile|Release]\bin" directory. If you would like to rebuild the application yourself, follow these steps:
To run the Integration Demo, simply launch the executable found in the directory mentioned above.
|
Note: The banks are not included in the installer and have to be generated using the authoring tool. |
|
Note: You will need to use the software keyboard or hardware keyboard to interact with the integration demo. The software keyboard can be opened by holding the menu button for 2 seconds. |
You can navigate through the Integration Demo on Windows using the keyboard, a connected controller, or any DirectInput-compatible device.
Certain controls (such as Toggle controls and Numeric Sliders) allow you to change values. To change a value, press the LEFT and RIGHT arrow keys or the LEFT and RIGHT buttons on a gamepad's directional pad.
|
Tip: The application has an online help feature! To access the Help page, press F1 on the keyboard or the START button on a gamepad. |
The code behind each demonstration can be found in the "samples\IntegrationDemo\DemoPages" directory. For example, the code for the Localization demo will be in the DemoLocalization.h and DemoLocalization.cpp files in that directory.
|
Tip: Pertinent information about each demo can also be found in the Integration Demo application's online help. |
This demo shows how to implement localized audio. Localized sound objects are found in language-specific SoundBanks located in subdirectories of the SoundBank generation directory. We achieve the localization effect by unloading the current SoundBank and reloading the desired language-specific SoundBank.
Use the "Language" Toggle control to switch the current language. Then press the "Say Hello" button to hear a greeting in the selected language.
For more information about languages and localization, see Integration Details - Languages and Voices
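The exact calls depend on your bank layout, but the pattern looks roughly like the sketch below. The bank name is a placeholder, and the LoadBank/UnloadBank signatures can vary slightly between SDK versions:

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>
#include <AK/SoundEngine/Common/AkStreamMgrModule.h>

// Switch the spoken language, then reload the localized bank so the
// language-specific copy is picked up. "Localized.bnk" is a placeholder name.
void SetSpokenLanguage(const AkOSChar* in_szLanguage)   // e.g. AKTEXT("French(Canada)")
{
    // Release the bank that holds the currently loaded localized voices.
    AK::SoundEngine::UnloadBank("Localized.bnk", nullptr);

    // Point the Stream Manager at the matching language subdirectory
    // of the SoundBank generation directory.
    AK::StreamMgr::SetCurrentLanguage(in_szLanguage);

    // Reload the same bank; it now resolves to the selected language's copy.
    AkBankID bankID;
    AK::SoundEngine::LoadBank("Localized.bnk", bankID);
}
```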
The Dynamic Dialogue demo runs through a series of tests that use Wwise's Dynamic Dialogue features. Each of these tests demonstrates a different control flow so that you can hear the effect it produces:
For more information about Dynamic Dialogue, see Integration Details - Dynamic Dialogue
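As a rough illustration of the resolve-and-play pattern these tests rely on, the sketch below resolves a dialogue event against a set of argument values and queues the result on a dynamic sequence. The event and argument IDs are hypothetical; a real project takes them from its generated Wwise_IDs.h header:

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>
#include <AK/SoundEngine/Common/AkDynamicDialogue.h>
#include <AK/SoundEngine/Common/AkDynamicSequence.h>

void PlayDialogueLine(AkGameObjectID in_gameObj,
                      AkUniqueID in_dialogueEventID,
                      AkArgumentValueID* in_pArgs, AkUInt32 in_uNumArgs)
{
    // Resolve the dialogue event against the current argument path to get
    // the audio node selected by the event's rules.
    AkUniqueID nodeID = AK::SoundEngine::DynamicDialogue::ResolveDialogueEvent(
        in_dialogueEventID, in_pArgs, in_uNumArgs);

    // Queue and play that node on a dynamic sequence.
    AkPlayingID seqID = AK::SoundEngine::DynamicSequence::Open(in_gameObj);
    if (AK::SoundEngine::DynamicSequence::Playlist* pPlaylist =
            AK::SoundEngine::DynamicSequence::LockPlaylist(seqID))
    {
        pPlaylist->Enqueue(nodeID);
        AK::SoundEngine::DynamicSequence::UnlockPlaylist(seqID);
    }
    AK::SoundEngine::DynamicSequence::Play(seqID);
    AK::SoundEngine::DynamicSequence::Close(seqID); // release once playback finishes
}
```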
This demo shows how to use RTPCs. The RPM numeric slider is linked with an RTPC value (RPM) associated with the engine. Press the "Start Engine" button to start/stop car engine audio. Use the RPM slider to change the RTPC value and hear the effect.
For more information about RTPCs, see Integration Details - RTPCs
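In code, the demo page boils down to posting an event and updating a game parameter. The event name and game object ID below are illustrative; the "RPM" game parameter matches the one described above:

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>

static const AkGameObjectID GAME_OBJECT_CAR = 100;   // arbitrary ID for the car

void StartEngine()
{
    AK::SoundEngine::RegisterGameObj(GAME_OBJECT_CAR);
    AK::SoundEngine::PostEvent("Play_Engine", GAME_OBJECT_CAR);  // illustrative event name
}

void OnRPMSliderChanged(AkRtpcValue in_rpm)
{
    // The RTPC curves authored in the Wwise project map the RPM value
    // to pitch and volume changes on the engine sound.
    AK::SoundEngine::SetRTPCValue("RPM", in_rpm, GAME_OBJECT_CAR);
}
```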
This demo shows various ways to implement footsteps in a game. It also shows surface-driven bank management, which minimizes both media and metadata memory when a surface isn't in use. Finally, this demo also shows a very simple case of environmental effects.
In this example, the footstep sounds are modified by three variables: surface, walking speed, and walker weight.
With each surface, we show a different way of dealing with the sound samples and variables. These are only suggestions and ideas that you can use in your own structure.
In this demo, the banks were divided into four media banks (one per surface). We divided the screen into four zones, with a buffer zone between surfaces where both adjacent banks are loaded. This avoids a gap in the footsteps due to bank loading. In the SoundBank Manager, look at the Game Syncs tab: each surface bank includes only the corresponding surface switch, which pulls in only the hierarchy related to that switch and nothing else. In a large game, this setup limits the amount of unused samples loaded in a particular scenario, and therefore the memory used. For level- or section-based games, it is easy to identify the surfaces used because they are known at the design stage. For open games this is trickier and depends a lot on the organization of your game, but it can still be achieved. For example, it is pointless to keep the "snow and ice" surface sounds in memory if your player is currently in a warm city and won't be moving toward colder settings for a long time.
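A minimal sketch of the run-time side of this setup is shown below: load or unload a surface bank when the player crosses a zone boundary, then set the switch and the speed/weight game parameters before posting the footstep event. All bank, switch, RTPC, and event names here are illustrative:

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>

void OnEnterSurfaceZone(const char* in_szSurfaceBank)    // e.g. "Gravel.bnk"
{
    // Load the surface bank while still inside the buffer zone,
    // so footsteps never wait on a bank load.
    AkBankID bankID;
    AK::SoundEngine::LoadBank(in_szSurfaceBank, bankID);
}

void OnLeaveSurfaceZone(const char* in_szSurfaceBank)
{
    // Release the media for a surface that is no longer in use.
    AK::SoundEngine::UnloadBank(in_szSurfaceBank, nullptr);
}

void PlayFootstep(AkGameObjectID in_player, const char* in_szSurface,
                  AkRtpcValue in_speed, AkRtpcValue in_weight)
{
    AK::SoundEngine::SetSwitch("Surface", in_szSurface, in_player);
    AK::SoundEngine::SetRTPCValue("Footstep_Speed", in_speed, in_player);
    AK::SoundEngine::SetRTPCValue("Footstep_Weight", in_weight, in_player);
    AK::SoundEngine::PostEvent("Play_Footsteps", in_player);
}
```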
This demo shows how you can set up a callback function to receive notification when markers inside a sound file are hit. For this demonstration, we are using the markers to synchronize subtitles with the audio track.
For more information on markers, see Integrating Markers
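In outline, the demo registers a callback when posting the event and asks for marker notifications. The event name is illustrative, and the subtitle handling is reduced to a comment:

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>
#include <AK/SoundEngine/Common/AkCallback.h>

static void MarkerCallback(AkCallbackType in_eType, AkCallbackInfo* in_pInfo)
{
    if (in_eType == AK_Marker)
    {
        AkMarkerCallbackInfo* pMarker = static_cast<AkMarkerCallbackInfo*>(in_pInfo);
        // pMarker->strLabel holds the marker text authored in the wave file;
        // the demo uses it to choose which subtitle line to display.
        // ShowSubtitle(pMarker->strLabel);   // hypothetical game-side function
    }
    // AK_EndOfEvent can be used to hide the subtitle when playback ends.
}

AkPlayingID PlayNarrationWithSubtitles(AkGameObjectID in_gameObj)
{
    return AK::SoundEngine::PostEvent(
        "Play_Markers_Test", in_gameObj,     // illustrative event name
        AK_Marker | AK_EndOfEvent,           // request marker and end-of-event notifications
        &MarkerCallback, nullptr);
}
```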
Music Sync Callback Demo - Shows how to use music callbacks in general. Beat and bar notifications are generated from the music's tempo and time signature information.
Music Playlist Callback Demo - Shows how to force a random playlist to select its next item sequentially. The playlist item may also be stopped via the callback.
MIDI Callback Demo - Shows the MIDI messages the game can receive using callbacks. MIDI messages include MIDI notes, CC values, Pitch Bend, After Touch, and Program Changes.
For more information on music callbacks, refer to Integration Details - Music Callbacks
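Music callbacks are requested the same way as other event callbacks, by passing the relevant flags to PostEvent. The sketch below subscribes to beat and bar notifications; the event name is illustrative, and the full notification data is described by AkMusicSyncCallbackInfo in AkCallback.h:

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>
#include <AK/SoundEngine/Common/AkCallback.h>

static void MusicCallback(AkCallbackType in_eType, AkCallbackInfo* in_pInfo)
{
    if (in_eType == AK_MusicSyncBeat)
    {
        // A new beat has started: flash a beat indicator, sync gameplay, etc.
    }
    else if (in_eType == AK_MusicSyncBar)
    {
        // A new bar has started.
    }
}

AkPlayingID PlayMusicWithSync(AkGameObjectID in_gameObj)
{
    return AK::SoundEngine::PostEvent(
        "Play_Music", in_gameObj,                // illustrative event name
        AK_MusicSyncBeat | AK_MusicSyncBar,      // subscribe to beat and bar notifications
        &MusicCallback, nullptr);
}
```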
This example uses a Music Switch container. Try switching states by triggering the events listed on the demo page. Depending on the rules defined in the music container, a state change may take effect immediately or only at the time specified by those rules.
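Driving the container from code amounts to changing a State. The state group and state names below are illustrative; the transition timing is governed entirely by the rules authored in the container:

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>

void EnterCombat()
{
    // The Music Switch container transitions according to its authored rules.
    AK::SoundEngine::SetState("Music_State", "Combat");
}

void ReturnToExploration()
{
    AK::SoundEngine::SetState("Music_State", "Exploration");
}
```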
This is a multiplayer demonstration which shows how to integrate Wwise's motion engine into your game.
In this demonstration, each player has the option to either close a door in the environment or shoot the gun that they are holding. A listener is set for each player, and it is active on the door game object as well as the player's own gun. This way, if any player closes the door in the environment, all players receive force feedback; however, only the player who fired their weapon receives force feedback for that event.
|
Note: A player using a keyboard should plug in a gamepad to participate in this demo. |
For more information on the Wwise Motion Engine, see Integrating Wwise Motion
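The listener routing described above can be expressed with the sound engine's emitter/listener association API, assuming a recent SDK that exposes AK::SoundEngine::SetListeners. The platform-specific motion device and output setup is omitted from this sketch:

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>

void SetUpMotionRouting(const AkGameObjectID* in_pPlayerListeners, AkUInt32 in_uNumPlayers,
                        AkGameObjectID in_door, const AkGameObjectID* in_pGuns)
{
    // The door is shared: every player's listener receives its feedback.
    AK::SoundEngine::SetListeners(in_door, in_pPlayerListeners, in_uNumPlayers);

    // Each gun is private: only the owning player's listener receives it.
    for (AkUInt32 i = 0; i < in_uNumPlayers; ++i)
        AK::SoundEngine::SetListeners(in_pGuns[i], &in_pPlayerListeners[i], 1);
}
```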
This demo shows how to record audio from a microphone and feed it into the Wwise sound engine. In the Integration Demo, select the "Microphone Demo" and speak into the microphone to hear your voice played back by the Wwise sound engine. Toggle "Enable Delay" to hear an example of how audio data fed to the Audio Input plug-in can be processed like any other sound created in Wwise.
Each platform has a very different core API to access the microphone. Check the SoundInput and SoundInputMgr classes in the Integration Demo code to see how they interact with the AudioInput plug-in.
The microphone sample was tested using "Logitech USB Microphones" on all platforms.
See also Audio Input Source Plug-in.
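As a very rough sketch of the hand-off from the capture code to the Audio Input plug-in, the example below assumes the callback registration exposed in AkAudioInputPlugin.h. CaptureMicrophoneFrames() is a hypothetical stand-in for the platform-specific capture performed by SoundInput/SoundInputMgr, and the format fields may differ slightly between SDK versions:

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>
#include <AK/Plugin/AkAudioInputPlugin.h>

// Hypothetical platform capture helper; returns the number of frames written.
AkUInt16 CaptureMicrophoneFrames(AkSampleType* out_pSamples, AkUInt16 in_uMaxFrames);

// Tell the Audio Input source what format the captured data is in.
static void GetMicFormat(AkPlayingID /*in_playingID*/, AkAudioFormat& io_format)
{
    io_format.uSampleRate = 16000;                               // capture rate used here
    io_format.channelConfig.SetStandard(AK_SPEAKER_SETUP_MONO);  // one microphone channel
}

// Fill the plug-in's buffer with whatever the microphone has captured so far.
static void FillMicBuffer(AkPlayingID /*in_playingID*/, AkAudioBuffer* io_pBuffer)
{
    AkSampleType* pChannel = io_pBuffer->GetChannel(0);
    AkUInt16 uFrames = CaptureMicrophoneFrames(pChannel, io_pBuffer->MaxFrames());
    io_pBuffer->uValidFrames = uFrames;
    io_pBuffer->eState = (uFrames > 0) ? AK_DataReady : AK_NoDataReady;
}

void HookUpMicrophone()
{
    SetAudioInputCallbacks(&FillMicBuffer, &GetMicFormat);
}
```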
This demo shows how to use external sources. Both buttons play the same sound structure, but it is set up at run time with either sources "1", "2", and "3" or sources "4", "5", and "6".
For more information on the external sources feature, see Integrating External Sources.
Additionally, the external sources are packaged into a file package with the File Packager and loaded when opening the demo page. Refer to the Wwise Help for more information on the File Packager, and to the Streaming / Stream Manager chapter for more details on the run-time aspect of file packages.
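A sketch of the run-time side follows. The cookie strings must match the names authored on the external source objects in the Wwise project; the file names, codec, and event name are illustrative:

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>

AkPlayingID PlayWithExternalSources(AkGameObjectID in_gameObj)
{
    // One entry per external source referenced by the event's sound structure.
    AkExternalSourceInfo sources[3];

    // The cookie is the ID of the name authored on the external source object.
    sources[0].iExternalSrcCookie = AK::SoundEngine::GetIDFromString("1");
    sources[0].szFile = (AkOSChar*)AKTEXT("01.wem");   // illustrative converted file
    sources[0].idCodec = AKCODECID_VORBIS;
    // ... fill sources[1] and sources[2] for cookies "2" and "3" the same way ...

    return AK::SoundEngine::PostEvent(
        "Play_External_Sources", in_gameObj,   // illustrative event name
        0, nullptr, nullptr,                   // no callback
        3, sources);                           // pass the external sources along
}
```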
The Wwise project for this program is also available in "samples\IntegrationDemo\WwiseProject".
|
Note: The Wwise project for this program uses various audio file conversion formats, some of which may not be available depending on which platforms are supported by your Wwise installation. After opening the project in Wwise, you may see warnings such as: '\Actor-Mixer Hierarchy\Dialogues\Captain_A\UNA-BM-AL_01\UNA-BM-AL_01' uses the conversion plugin 'XMA', which is not installed. You can make these messages disappear by changing the conversion format for all unavailable platforms to PCM. Refer to the following topic in the Wwise User Guide for more information: Finishing Your Project > Managing Platform and Language Versions > Authoring Across Platforms > Converting Audio Files. |
SoundBanks for this project are also installed with the SDK in the "samples\IntegrationDemo\WwiseProject\GeneratedSoundBanks" folder.
To regenerate the SoundBanks, make sure to do the following in the SoundBank Manager:
Once these settings are correct, you can click on Generate in the SoundBank Manager to generate the banks.
The Integration Demo as well as its Wwise Project are kept very simple in order to demonstrate the basics of sound engine integration. For a more realistic integration project, refer to the AkCube Sound Engine Integration Sample Project.