Version 2017.2.10.6745
This integration provides a few components that can be used without code, directly in a scene, for the most frequent usage scenarios:

- AkAudioListener: its isDefaultListener property determines whether the game object will be considered a default listener - a listener that automatically listens to all game objects that do not have listeners attached to their AkGameObjListenerList.
- AkState: calls AkSoundEngine.SetState() whenever the selected Unity event is triggered.
- AkSwitch: calls AkSoundEngine.SetSwitch() whenever the selected Unity event is triggered. For example, this component could be set on a Unity collider to trigger when an object enters it.

This integration also provides a few classes that can be used, with minimal code, for most remaining usage scenarios.
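The component behaviors above map to one-line sound engine calls. Here is a minimal sketch of the script equivalents; the State Group "Music", State "Combat", Switch Group "Surface", and Switch "Grass" are hypothetical names that would come from your Wwise project:

```csharp
using UnityEngine;

public class TriggerExamples : MonoBehaviour
{
    private void OnTriggerEnter(Collider other)
    {
        // Equivalent of an AkState component reacting to a trigger event:
        // sets the "Combat" State in the hypothetical "Music" State Group.
        AkSoundEngine.SetState("Music", "Combat");

        // Equivalent of an AkSwitch component: sets the "Grass" Switch in the
        // hypothetical "Surface" Switch Group on the entering game object.
        AkSoundEngine.SetSwitch("Surface", "Grass", other.gameObject);
    }
}
```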
There are four ways to add sounds to your game, two of which can be done at any time from a C# script:

- Calling AK.Wwise.Event.Post().
- Calling AkSoundEngine.PostEvent().

For Unity's Timeline feature, there are custom Wwise tracks for triggering Wwise events and setting Wwise RTPC values.
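As a sketch of the two script-based approaches (the Event name "Play_Footstep" is hypothetical and would come from your Wwise project):

```csharp
using UnityEngine;

public class FootstepEmitter : MonoBehaviour
{
    // Assign this Event in the Inspector via the integration's Event picker.
    public AK.Wwise.Event FootstepEvent;

    private void PlayByReference()
    {
        // Post the Event object assigned in the Inspector.
        FootstepEvent.Post(gameObject);
    }

    private void PlayByName()
    {
        // Post by name; "Play_Footstep" is a hypothetical Event name.
        AkSoundEngine.PostEvent("Play_Footstep", gameObject);
    }
}
```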
In Wwise, Reverb Zones are called Environments or Auxiliary Sends. Reverb Zones are not limited to reverb effects and are defined in the Wwise project.
An AkEnvironment component embodies a very simple environment zone. You can attach an AkEnvironment to any type of collider to add it to your scene. From code, you can also call AkSoundEngine.SetGameObjectAuxSendValues() at any time from a C# script.

The integration also provides portals, which can be used to combine the effects of two environments. The contribution of each of the two environments is relative to its distance from the game object. This is useful if a game object is standing between two rooms, or in a tunnel connecting two environments.
To use environments and environment portals, you need a game object with an AkGameObj component that is environment-aware.
AkEnvironmentPortal objects automatically detect AkEnvironment objects that overlap them. The overlapping environments appear in the two select lists in the portal's Inspector. If too many environments overlap the portal, you can select which ones the portal will mix together.

In Wwise, only four environments can be active at the same time.
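For reference, here is a sketch of driving aux sends directly from script instead of relying on AkEnvironment components. The "Cave" Auxiliary Bus name is hypothetical, and the exact shape of the AkAuxSendArray helper varies between integration versions, so treat this as illustrative:

```csharp
using UnityEngine;

public class ManualAuxSend : MonoBehaviour
{
    private void Start()
    {
        // Send 50% of this game object's signal to a hypothetical
        // "Cave" Auxiliary Bus defined in the Wwise project.
        var auxSends = new AkAuxSendArray(1);
        auxSends.Add(AkSoundEngine.GetIDFromString("Cave"), 0.5f);
        AkSoundEngine.SetGameObjectAuxSendValues(gameObject, auxSends, 1);
    }
}
```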
Most Wwise SDK functions are available in Unity through the AkSoundEngine class. Think of it as a replacement for the C++ namespaces AK::SoundEngine, AK::MusicEngine, and so on. See API Changes and Limitations for changes made in the API binding compared to the original SDK. For more complex situations, you'll need to call Wwise functions from code. In the API, the GameObjectID found in all functions is replaced by the Unity flavor of the GameObject. At runtime, an AkGameObj component is automatically added to this GameObject, unless you have already added one manually.
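For example, where the native SDK takes an AkGameObjectID, the Unity binding takes the GameObject itself. A minimal sketch; the Game Parameter name "Health" is hypothetical:

```csharp
using UnityEngine;

public class HealthAudio : MonoBehaviour
{
    public void OnHealthChanged(float health)
    {
        // Native SDK: AK::SoundEngine::SetRTPCValue(rtpcID, value, gameObjectID).
        // Unity binding: pass the GameObject directly; an AkGameObj component
        // is added to it automatically at runtime if not already present.
        AkSoundEngine.SetRTPCValue("Health", health, gameObject);
    }
}
```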
The native Wwise API allows you to use strings or IDs to trigger Events and other named objects in the Wwise project. You can still do this in C# by converting the Wwise_IDs.h file to Wwise_IDs.cs: click Assets > Wwise > Convert Wwise SoundBank IDs. Python must be installed for this to work.
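Once Wwise_IDs.cs has been generated, you can post by ID instead of by string; a sketch, assuming a hypothetical PLAY_MUSIC Event exists in the project:

```csharp
using UnityEngine;

public class MusicStarter : MonoBehaviour
{
    private void Start()
    {
        // Posting by string hashes the name at runtime:
        // AkSoundEngine.PostEvent("Play_Music", gameObject);

        // Posting by generated ID avoids the runtime hash; AK.EVENTS.PLAY_MUSIC
        // is a hypothetical constant from the converted Wwise_IDs.cs.
        AkSoundEngine.PostEvent(AK.EVENTS.PLAY_MUSIC, gameObject);
    }
}
```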
MIDI can be sent to Wwise by filling the AkMIDIPost members of an AkMIDIPostArray and calling any of the following methods:
AkMIDIPostArray.PostOnEvent()
AkSoundEngine.PostMIDIOnEvent()
AK.Wwise.Event.PostMIDI()
The following is a basic script that sends MIDI messages to the sound engine:
public class MyMIDIBehaviour : UnityEngine.MonoBehaviour
{
    public AK.Wwise.Event SynthEvent;

    private void Start()
    {
        AkMIDIPostArray MIDIPostArrayBuffer = new AkMIDIPostArray(6);
        AkMIDIPost midiEvent = new AkMIDIPost();

        midiEvent.byType = AkMIDIEventTypes.NOTE_ON;
        midiEvent.byChan = 0;
        midiEvent.byOnOffNote = 56;
        midiEvent.byVelocity = 127;
        midiEvent.uOffset = 0;
        MIDIPostArrayBuffer[0] = midiEvent;

        midiEvent.byOnOffNote = 60;
        MIDIPostArrayBuffer[1] = midiEvent;

        midiEvent.byOnOffNote = 64;
        MIDIPostArrayBuffer[2] = midiEvent;

        midiEvent.byType = AkMIDIEventTypes.NOTE_OFF;
        midiEvent.byOnOffNote = 56;
        midiEvent.byVelocity = 0;
        midiEvent.uOffset = 48000 * 8;
        MIDIPostArrayBuffer[3] = midiEvent;

        midiEvent.byOnOffNote = 60;
        MIDIPostArrayBuffer[4] = midiEvent;

        midiEvent.byOnOffNote = 64;
        MIDIPostArrayBuffer[5] = midiEvent;

        SynthEvent.PostMIDI(gameObject, MIDIPostArrayBuffer);
    }
}
The audio input source plug-in can be used via C# scripting. See Audio Input Source Plug-in from the Wwise SDK documentation.
The following is a basic script that sends a test tone to the audio input source plug-in:
public class MyAudioInputBehaviour : UnityEngine.MonoBehaviour
{
    public AK.Wwise.Event AudioInputEvent;
    public uint SampleRate = 48000;
    public uint NumberOfChannels = 1;
    public uint SampleIndex = 0;
    public uint Frequency = 880;

    private bool IsPlaying = true;

    // Callback that fills audio samples - This function is called each frame for every channel.
    bool AudioSamplesDelegate(uint playingID, uint channelIndex, float[] samples)
    {
        for (uint i = 0; i < samples.Length; ++i)
            samples[i] = UnityEngine.Mathf.Sin(Frequency * 2 * UnityEngine.Mathf.PI * (i + SampleIndex) / SampleRate);

        if (channelIndex == NumberOfChannels - 1)
            SampleIndex = (uint)(SampleIndex + samples.Length) % SampleRate;

        // Return false to indicate that there is no more data to provide. This will also stop the associated event.
        return IsPlaying;
    }

    // Callback that sets the audio format - This function is called once before samples are requested.
    void AudioFormatDelegate(uint playingID, AkAudioFormat audioFormat)
    {
        // Channel configuration and sample rate are the main parameters that need to be set.
        audioFormat.channelConfig.uNumChannels = NumberOfChannels;
        audioFormat.uSampleRate = SampleRate;
    }

    private void Start()
    {
        // The AudioInputEvent event, which is set up within Wwise to use the Audio Input plug-in, is posted on gameObject.
        // AudioFormatDelegate is called once, and AudioSamplesDelegate is called once per frame until it returns false.
        AkAudioInputManager.PostAudioInputEvent(AudioInputEvent, gameObject, AudioSamplesDelegate, AudioFormatDelegate);
    }

    // This method can be called by other scripts to stop the callback.
    public void StopSound()
    {
        IsPlaying = false;
    }

    private void OnDestroy()
    {
        AudioInputEvent.Stop(gameObject);
    }
}
By default, the AkGameObj component is attached to a specific Unity gameObject and uses its transform (with an optional offset) for full positioning. This is usually adequate for many games, such as first-person shooters. However, games with custom camera angles, such as many third-person games, may find it difficult to accommodate the two aspects of positioning (distance attenuation and spatialization) by simply attaching the audio listener to one game object, such as the main camera. Other games may require further customized positioning.
To this end, the AkGameObj component class provides overridable positioning to Unity users. Through the three virtual methods GetPosition(), GetForward(), and GetUpward(), users can derive a subclass from AkGameObj and use that subclass component to customize the positioning of any number of Unity gameObjects.
Here is a simple example of how to use a custom component to override the default AkAudioListener behavior. With a third-person project integrated with Wwise, remove the existing AkAudioListener and its associated AkGameObj. Then attach the following script to the MainCamera object, attach an AkAudioListener, and finally specify the target Unity gameObject (such as the player avatar) that the audio listener's position will follow. After this, the distance attenuation of all emitters will rely on the selected target's position as the listener position (an on-screen distance listener), while the orientation of all emitters is still based on the main camera's orientation as the listener orientation (an off-screen orientation listener).
#if !(UNITY_DASHBOARD_WIDGET || UNITY_WEBPLAYER || UNITY_WII || UNITY_WIIU || UNITY_NACL || UNITY_FLASH || UNITY_BLACKBERRY) // Disable under unsupported platforms.
//
// Copyright (c) 2017 Audiokinetic Inc. / All Rights Reserved
//
using UnityEngine;
using System;
using System.Collections.Generic;

[AddComponentMenu("Wwise/AkGameObj3rdPersonCam")]
[ExecuteInEditMode] // ExecuteInEditMode necessary to maintain proper state of isStaticObject.
public class AkGameObj3rdPersonCam : AkGameObj
{
    // The position that this camera will be following. The user can set this to
    // the player character's Unity gameObject in the Inspector.
    public Transform target;

    // Sets the camera position to the player's position to handle distance attenuation.
    public override Vector3 GetPosition()
    {
        return target.GetComponent<AkGameObj>().GetPosition();
    }
}
#endif // #if !(UNITY_DASHBOARD_WIDGET || UNITY_WEBPLAYER || UNITY_WII || UNITY_WIIU || UNITY_NACL || UNITY_FLASH || UNITY_BLACKBERRY) // Disable under unsupported platforms.