Wwise SDK 2023.1.8
Integrating Wwise Motion

You can use Motion plug-ins to control the haptic feedback of a control interface. With Wwise, you can use the same set of tools to manage motion and audio in your application. Internally, motion data is no different from audio data, and all the features available for audio are also available for motion. Two types of haptic feedback are available through the motion feature: you can transform any audio signal in your project into motion, or generate dedicated motion signals with the Motion source. You can test motion directly in Wwise Authoring on Windows with a supported controller.

Motion Components

Motion uses the Wwise Sound Engine plug-in system to work in an application, and can be subdivided into two modules: the audio source called Motion and the Audio Device called Wwise Motion. Although the Motion audio source is optional, it remains a powerful tool to create accurate and flexible motion designs.

Wwise Motion Audio Device Plug-in

The Wwise Motion audio device plug-in links the sound engine to motion-ready devices. Just like any audio device plug-in, it receives data from a set of Listeners and "presents" this data to a device. This plug-in is inside a separate library and needs to be included in both Wwise Authoring and the application. Refer to the Setting Up Motion section for more information.

Motion Source Plug-in

You can use the Motion source plug-in to design the behavior of haptic feedback effects. Just like any audio source, you can add a Motion source plug-in to a Sound SFX node in your Wwise project. Make sure that the Sound SFX nodes have Output Busses set to motion-ready busses. See Motion for more information.

Setting Up Motion

To use motion in your application, you need to properly set up each component. All the concepts applicable to the audio workflow are also applicable to motion. It uses the same busses, Listeners, and Emitters (see Integrating Listeners).

Wwise Authoring Setup

To be able to send either sound or motion data to a device, you need to add the licensed Wwise Motion Audio Device to the Audio Device folder of your Wwise project, located in the Audio tab of the Project Explorer. The Wwise Motion Audio Device is the plug-in used by the sound engine to interface with a motion-ready device. It is also crucial to assign the Wwise Motion Audio Device to a top level Audio Bus. The term motion bus denotes a top-level Audio Bus with a Wwise Motion Audio Device assigned to it. It is good practice to use a single motion bus hierarchy in your project for easier troubleshooting and monitoring. You can then set the Output Bus of any Sound SFX to a motion bus to create haptic feedback. Usually the Sound SFX elements that use a motion bus also use a Motion source. To simultaneously have audio and motion, a Sound SFX needs to have at least one motion bus and one Audio Bus, either as the Output Bus or as an Auxiliary Bus.

Game Setup

On the game side, link with the separate library called AkMotionSink. This library provides support for the standard controllers of supported platforms. You also need to include the AkMotionSinkFactory.h file, located under SDK\include\AK\plugin. It is important to include this file because it automatically registers the plug-in.

Note: In Unity and Unreal, plug-in libraries are managed automatically. You don't have to add AkMotionSink manually.

Refer to the following table for the list of supported controllers and additional requirements.

Each entry lists the supported devices, the device channel configuration and layout, and any additional requirements.

  • Android: Android device with vibration support. Anonymous 1-channel.
  • iOS: iOS device with vibration support, iOS-compatible Controllers. Anonymous 1-channel. Requires CoreHaptics.framework and GameController.framework.
  • Linux: Not supported.
  • Mac: Mac-compatible Controllers. Anonymous 1-channel. Requires CoreHaptics.framework and GameController.framework.
  • OpenHarmony: OpenHarmony device with vibration support. Anonymous 1-channel.
  • PlayStation 4: DUALSHOCK 4, PlayStation Move. Anonymous 2-channel: left motor, right motor.
  • PlayStation 5: DualSense, VR Controllers. Stereo 2-channel: left vibration, right vibration.
  • Switch: Joy-Con. Anonymous 4-channel: left low-frequency vibration, left high-frequency vibration, right low-frequency vibration, right high-frequency vibration.
  • Windows: Xbox and XInput-compatible Controllers, DirectInput-compatible Controllers. Anonymous 2-channel: left motor, right motor. Requires XInput.lib, Dinput8.lib, and Winmm.lib.
  • Xbox One, Xbox Series X: Xbox Controllers. Anonymous 4-channel: left motor, right motor, left trigger, right trigger.


If your application uses motion with one or more devices, you must add a dedicated output for each device. For example, a split-screen game with four players connected needs four different outputs for the controllers to receive haptic feedback. To add an output device, use the Wwise API's AK::SoundEngine::AddOutput function and specify the ShareSet name (as defined in your Wwise project) in the AkOutputSettings parameter. Additionally, because multiple devices can be connected, you need to provide a device ID. The following table contains more information about device IDs.

  • Android (Android device with vibration support): Use 0.
  • iOS (iOS device with vibration support and iOS-compatible Controllers): To vibrate the device, use 0. To vibrate a connected controller, first assign a player index to the desired GCController instance using the GameController.framework API, then retrieve the DeviceID of that player index by calling AK::SoundEngine::GetDeviceIDFromPlayerIndex. If you want to use resident mode, add AKMOTION_RESIDENT_MODE to the DeviceID. Normally, vibrations are ignored for a few tens of milliseconds when starting from silence; this delay is eliminated in resident mode, at the cost of slightly higher device power consumption.
  • Linux: Not supported.
  • Mac (Mac-compatible Controllers): First assign a player index to the desired GCController instance using the GameController.framework API, then retrieve the DeviceID of that player index by calling AK::SoundEngine::GetDeviceIDFromPlayerIndex. If you want to use resident mode, add AKMOTION_RESIDENT_MODE to the DeviceID. Normally, vibrations are ignored for a few tens of milliseconds when starting from silence; this delay is eliminated in resident mode, at the cost of slightly higher device power consumption.
  • OpenHarmony (OpenHarmony device with vibration support): Use 0.
  • PlayStation 4 (DUALSHOCK 4 and PlayStation Move): Use the handle of the device returned by scePadOpen or scePadGetHandle.
  • PlayStation 5 (DualSense and VR Controllers): Use the handle of the device returned by scePadOpen or scePadGetHandle. For PSVR2, create only one output device, using the handle of either the left or right VR controller, to have vibrations on both VR controllers. If you specify a device ID of 0, the system is only initialized for the wireless controller, not the VR controllers. The Advanced vibration control mode, which is required for haptic feedback, is enabled in the System Software by default for DualSense and VR Controllers on PlayStation 5. If your code calls scePadSetVibrationMode, ensure that you do not use any behavior that disables Advanced vibration control mode, such as setting a vibration mode of SCE_PAD_VIBRATION_MODE_COMPATIBLE.
  • Switch (Joy-Con): Use nn::hid::NpadId with the desired index.
  • Windows (Xbox and XInput-compatible Controllers): Use a player index between 0 and 3.
  • Windows (DirectInput-compatible Controllers): Use the guidProduct stored in a DIDEVICEINSTANCE, hashed with AK::FNVHash32.
  • Xbox One (XDK) (Xbox Controllers): Use the ID stored in an IGamepad object.
  • Xbox One (GDK) and Xbox Series X (Xbox Controllers): Retrieve the DeviceID of a gamepad by calling AK::SoundEngine::GetGameInputDeviceID.


Note: On all platforms except Windows, a device ID of "0" targets the first available device that supports motion.

Remember that game controllers can be disconnected either physically or due to communication problems. Disconnection does not have any adverse effects on the sound engine other than needlessly using resources. If you think a device has been disconnected for a long time, call AK::SoundEngine::RemoveOutput and provide the AkOutputDeviceID returned by the corresponding AddOutput() function call.
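For example, a minimal sketch of this cleanup, assuming the returned AkOutputDeviceID was stored when the output was added ("Wwise_Motion" is an assumed ShareSet name here):

    // motionOutputID receives the ID through AddOutput's out_pDeviceID parameter.
    AkOutputDeviceID motionOutputID = 0;
    AkOutputSettings settings("Wwise_Motion", 0);  // "Wwise_Motion" is an assumed ShareSet name
    AK::SoundEngine::AddOutput(settings, &motionOutputID);

    // ... later, once the controller is considered permanently disconnected:
    AK::SoundEngine::RemoveOutput(motionOutputID);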

Multiplayer Considerations

Motion outputs are like any other Secondary Outputs and therefore have the same restrictions and requirements. If you are making a single-player game, in which only one player controls the game locally, the Listener/Game Object setup is very simple. In normal cases, the new motion output re-uses the same default Listener as the main audio output. In other words, you rarely have to manage Listeners in a single-player setup.

For multiplayer games, you must create one Listener/Game Object for each motion output. This is necessary so that each player has their own mix of haptic feedback, depending on the situation in the game. The Listener associated with a device must be initialized at the same time as the output is initialized by AK::SoundEngine::AddOutput(). Specific Listeners provide an additional layer of routing for sounds or motion. You can target a specific player if you play an Event on a Game Object that is uniquely associated with that player's Listener. To establish this association, you can call AK::SoundEngine::SetListeners. You can also associate multiple Listeners with the same Game Object, which has a "broadcast" effect on all of them. See Integrating Listeners for more information about Listeners and Game Objects.

Note: A sound needs to be routed into the motion bus hierarchy even if the Emitter has a Listener set to a motion output.

Examples

The following examples demonstrate how to set up your application to use motion. You can also refer to the Demo Motion example in the Integration Demo (DemoMotion.cpp) included in the SDK samples. It provides working examples for all the supported platforms.

Setup for a single player

First, as with any other plug-in, you need to include the corresponding file and link the library (for Wwise native development only, not required for Unity).
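In C++, this amounts to an include and a link step; a minimal sketch, with the main sound engine header added here so that the later snippets in this section compile:

    #include <AK/SoundEngine/Common/AkSoundEngine.h>  // main AK::SoundEngine API, used by the snippets below
    #include <AK/Plugin/AkMotionSinkFactory.h>        // including this header automatically registers the Wwise Motion plug-in
    // Also link against the AkMotionSink library in your build settings.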

Next, add another output with the Wwise Motion Audio Device ShareSet name from the Wwise project, in this case "Wwise_Motion". The output ID is 0 to ensure that the first connected game controller is used.
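A minimal sketch of that call, assuming the AkOutputSettings constructor that takes a ShareSet name and a device ID:

    // "Wwise_Motion" is the Audio Device ShareSet name defined in the Wwise project.
    // Device ID 0 targets the first available motion-capable controller.
    AkOutputSettings motionSettings("Wwise_Motion", 0);
    AkOutputDeviceID motionOutputID = 0;
    if (AK::SoundEngine::AddOutput(motionSettings, &motionOutputID) != AK_Success)
    {
        // The device could not be initialized; an error also appears in the Capture Log.
    }
    // Keep motionOutputID if you need to call AK::SoundEngine::RemoveOutput later.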

Next, play an Event. In the Wwise project, the "Play_Explosion" Event is linked to a Sound SFX that is routed to a bus that has "Wwise_Motion" ShareSet assigned as its Audio Device.
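A minimal sketch, with a hypothetical Game Object ID for the emitter:

    // In a single-player setup, the motion output reuses the default Listener of the
    // main audio output, so posting on any registered Game Object is enough.
    const AkGameObjectID EMITTER = 100;  // hypothetical Game Object ID
    AK::SoundEngine::RegisterGameObj(EMITTER);
    AK::SoundEngine::PostEvent("Play_Explosion", EMITTER);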

Setup for multiple players

This section describes how to set up motion for multiple players on the same console, not a networked multiplayer game. In multiplayer scenarios, the mix for motion, or any player-specific output, must be different to represent the player's perspective in the game world. Players therefore need their own Listeners.

Add an output for every player. You can use an Audio Device ShareSet multiple times, so you don't have to create multiple Audio Device ShareSets. You must provide a real device ID for every controller. This example is for Xbox controllers on Windows. Refer to the table in Game Setup for information on how to retrieve device IDs for specific platforms.
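A minimal sketch of that loop for two players on Windows, with hypothetical Game Object IDs; the XInput player index (0 to 3) is used directly as the device ID:

    const AkUInt32 kNumPlayers = 2;               // hypothetical player count
    const AkGameObjectID kFirstListenerID = 1000; // hypothetical base ID for per-player Listeners

    for (AkUInt32 player = 0; player < kNumPlayers; ++player)
    {
        // One Listener Game Object per player, so each player gets their own motion mix.
        AkGameObjectID listenerID = kFirstListenerID + player;
        AK::SoundEngine::RegisterGameObj(listenerID);

        // The same "Wwise_Motion" ShareSet is reused; only the device ID (player index) changes.
        AkOutputSettings settings("Wwise_Motion", player);
        AK::SoundEngine::AddOutput(settings, nullptr, &listenerID, 1);
    }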

Next, play an Event. In the Wwise project, the "Play_GunFire" Event is linked to a Sound SFX that is routed to a bus that has "Wwise_Motion" assigned as its Audio Device ShareSet.
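A minimal sketch for player 0, with hypothetical Game Object IDs matching the previous sketch:

    // Post on a Game Object that only player 0's Listener listens to,
    // so only that player's controller vibrates.
    const AkGameObjectID kPlayer0Emitter = 2000;   // hypothetical emitter Game Object
    const AkGameObjectID kPlayer0Listener = 1000;  // first per-player Listener registered above
    AK::SoundEngine::RegisterGameObj(kPlayer0Emitter);
    AK::SoundEngine::SetListeners(kPlayer0Emitter, &kPlayer0Listener, 1);
    AK::SoundEngine::PostEvent("Play_GunFire", kPlayer0Emitter);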

To play an Event that affects multiple devices, set up a new Game Object, and ensure that all player-specific Listeners listen to it.
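A minimal sketch of this broadcast setup, again with hypothetical IDs:

    // A shared Game Object with every per-player Listener attached: posting on it
    // sends motion to all connected controllers at once.
    const AkGameObjectID kSharedEmitter = 3000;            // hypothetical shared Game Object
    const AkGameObjectID kAllListeners[] = { 1000, 1001 }; // per-player Listener IDs registered above
    AK::SoundEngine::RegisterGameObj(kSharedEmitter);
    AK::SoundEngine::SetListeners(kSharedEmitter, kAllListeners, 2);
    AK::SoundEngine::PostEvent("Play_Explosion", kSharedEmitter);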

For an example of multiplayer setup, refer to the DemoMotion class in the IntegrationDemo sample.

Checklist and Troubleshooting

For a specific device to receive motion in your game, you must:

  1. Add a Wwise Motion Audio Device ShareSet to your project.
  2. Create a top-level Audio Bus and set its Audio Device to the Wwise Motion Audio Device ShareSet you just added.
  3. Create Sound SFXs and route them to the motion bus hierarchy.
  4. In the game, include AkMotionSinkFactory.h and link with the AkMotionSink library.
    Note: In Unity and Unreal, plug-in libraries are managed automatically. This step is unnecessary.
  5. Create a Game Object with AK::SoundEngine::RegisterGameObj to post and receive motion Events.
  6. Call AK::SoundEngine::AddOutput with the Wwise Motion ShareSet and a device ID. For multiplayer, provide a Listener Game Object for each output.
  7. Trigger your Events the same way you do for audio Events.

To add Motion to the Wwise project:

  1. Add a Wwise Motion Audio Device ShareSet to your project.
  2. Create a top-level Audio Bus and set its Audio Device to the Wwise Motion Audio Device ShareSet you just added.
  3. Create Sound SFXs and route them to the motion bus hierarchy.
  4. In the Audio Preferences, locate the busses you want to use motion with and, from the list, set the device of each bus to the motion device of your choice.

Profiling

To troubleshoot problems, we recommend that you profile your application. In Wwise Authoring, you can profile your application as described in Profiling. Several tools can help you understand the source of a problem. The Capture Log view lists error codes in red. The Graph view displays a visual representation of the sound engine pipeline. The motion devices appear at the end of the pipeline. If no motion devices are displayed, the sound engine could not find the device you specified. Finally, the Emitter/Listener tab displays all Emitter-Listener pairs. If the Motion Effect does not work, the Emitter might not be associated with the Listener you specified for your motion device.

If you trigger a Motion effect but it doesn't work, do the following:

  • Check the Capture Log to ensure that there are no registration errors with the plug-in. If there are, confirm that you included AkMotionSinkFactory.h and linked to the AkMotionSink library.
  • Ensure that the sound engine initialized the device. If it didn't, an error is displayed in the Capture Log when AddOutput is called from the game.
  • Ensure that the Sound SFX you play is routed to a bus that leads to a motion device. Check the Output Bus property of the Sound SFX, and the Audio Device property of the bus.
  • From the Voices Graph tab, identify which Emitter triggered. Ensure that there is an associated Listener in the Emitter-Listener tab. The same Game Object can emit and listen. If there is no associated Listener, review your calls to RegisterGameObj, SetListeners, and AddOutput.
  • Ensure that the Listener is associated with the desired output device.
  • Review the results of any API calls you made and check for errors.

For Android devices, add the android.permission.VIBRATE permission to the application's AndroidManifest.xml file.
