Wwise SDK 2024.1.0
Third-Person Perspective and Spatial Audio
Note: For an introduction to working with listeners in third-person perspective applications, refer to Working with Listeners in Third-Person Perspective Games. This section elaborates with details pertaining specifically to Spatial Audio.

When working on third-person perspective (TPP) experiences with Wwise Spatial Audio initialized, there are some special considerations to keep in mind.

Wwise Spatial Audio performs sound propagation calculations using ray-based acoustic simulation techniques. In broad terms, we can think of Spatial Audio as solving for sound propagation paths between a sound emitter and a listener. A sound path may be subject to:

  • Reflection: bouncing off a surface.
  • Diffraction: bending around an object, surface, or portal.
  • Transmission: passing through an object, surface, or room.

In general, there are many paths between a single sound and a listener, and they are combined to produce the final rendered sound. For more general information on Spatial Audio, refer to Spatial Audio.

In a TPP experience, it’s not uncommon to have a discrepancy between what the camera experiences and what the character experiences. In such scenarios, it may be difficult or impossible to follow simple rules for rendering distance attenuation and filtering based on the Distance Probe, and panning/spatialization based on the camera position. Calculations performed for acoustic simulation can be CPU intensive, and it is neither possible nor desirable to perform the simulation for both the Listener Game Object and the Distance Probe Game Object.

It is, however, still possible to assign a Distance Probe to the Spatial Audio Listener. The general approach taken by Wwise Spatial Audio is to use the Listener position for sound propagation calculations. The distance value passed to the sound engine is then augmented to give results closer to what one would expect had the path been computed to the Distance Probe Game Object. The methods used to augment the distance value differ based on the type of path; details for reflection, diffraction, and transmission paths are provided in the following sections. This approach ensures that spatialization and panning are always true to the simulation and do not disrupt the player's sense of immersion. However, when the position of the Distance Probe and the position of the Listener are highly divergent, the attenuation value may give slightly unexpected results. In any case where Spatial Audio features are in use but a path computation results in an unobstructed straight line, attenuation results are exactly as expected.
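
For reference, the following minimal sketch shows one way this could be wired up with the sound engine and Spatial Audio APIs in a TPP setup, with the camera acting as the Listener and the player character as the Distance Probe. The game object IDs and names are illustrative assumptions, and error handling is omitted.

    #include <AK/SoundEngine/Common/AkSoundEngine.h>
    #include <AK/SpatialAudio/Common/AkSpatialAudio.h>

    // Illustrative game object IDs for a TPP setup: the camera is the
    // Spatial Audio Listener, the player character is the Distance Probe.
    static const AkGameObjectID CAMERA = 100;
    static const AkGameObjectID PLAYER_CHARACTER = 200;

    void SetupTppListener()
    {
        // Register both game objects with the sound engine.
        AK::SoundEngine::RegisterGameObj(CAMERA, "Camera");
        AK::SoundEngine::RegisterGameObj(PLAYER_CHARACTER, "PlayerCharacter");

        // The camera is the listener used for panning and spatialization...
        AK::SoundEngine::SetDefaultListeners(&CAMERA, 1);

        // ...and the Spatial Audio Listener used for sound propagation.
        AK::SpatialAudio::RegisterListener(CAMERA);

        // The character drives distance-based attenuation and filtering.
        AK::SoundEngine::SetDistanceProbe(CAMERA, PLAYER_CHARACTER);
    }

With this assignment in place, panning always follows the camera, while attenuation distances are adjusted per path type.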

The following sections describe in detail how the various Spatial Audio features operate when a Distance Probe is assigned to the Spatial Audio Listener.

Diffraction and Virtual Positioning

The diffraction system in Spatial Audio is primarily responsible for rendering the dry signal path for a particular sound and listener. Often, a single sound will have multiple diffraction paths.

  • A diffraction path is calculated according to the position of the Emitter and Listener game objects. Diffraction points are found according to geometry and/or portals along the path.
  • A virtual sound position is calculated.
  • The sound is panned according to the virtual position, as usual.
  • The diffraction coefficient is calculated according to the path between the Emitter and the Listener Game Objects and results in further attenuation and filtering of the sound.
  • The distance used to evaluate the attenuation curve of the sound is the distance between the virtual position and the Distance Probe Game Object (see the sketch following this list).
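
The sketch below illustrates how the two game objects are typically driven each frame, reusing the illustrative CAMERA and PLAYER_CHARACTER IDs from the earlier example; the parameter names and values are placeholders. The camera transform feeds the Listener, used for path finding, virtual positions, and panning, while the character position feeds the Distance Probe, used for attenuation distance.

    // Hypothetical per-frame update. CAMERA and PLAYER_CHARACTER are the
    // illustrative game object IDs registered in the previous sketch.
    void UpdateTppPositions(
        float camX, float camY, float camZ,       // camera (Listener) position
        float frontX, float frontY, float frontZ, // camera forward vector, normalized
        float topX, float topY, float topZ,       // camera up vector, normalized
        float chrX, float chrY, float chrZ)       // character (Distance Probe) position
    {
        // The Listener transform drives diffraction paths, virtual positions, and panning.
        AkSoundPosition cameraXf;
        cameraXf.SetPosition(camX, camY, camZ);
        cameraXf.SetOrientation(frontX, frontY, frontZ, topX, topY, topZ);
        AK::SoundEngine::SetPosition(CAMERA, cameraXf);

        // The Distance Probe position is what attenuation distances are measured against.
        AkSoundPosition probeXf;
        probeXf.SetPosition(chrX, chrY, chrZ);
        probeXf.SetOrientation(frontX, frontY, frontZ, topX, topY, topZ);
        AK::SoundEngine::SetPosition(PLAYER_CHARACTER, probeXf);
    }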

Transmission

For the dry signal path, transmissive sounds can be used with Distance Probes with no special consideration, since transmission paths always follow a straight line.

Reflections

The reflection system in Spatial Audio uses the Reflect plug-in to render early reflections for a sound and a Listener.

  • A reflection path is calculated by casting rays from the Listener.
  • When a path is found, it is traced back from the Emitter, bouncing off one or more surfaces, and ending at the Listener. An image source position is calculated and passed to the Reflect plug-in.
  • In the Reflect plug-in:
    • The reflection is panned according to the position of the Listener Game Object and the image source position.
    • The reflection is attenuated according to the distance between the image source and the Distance Probe Game Object. The curve used to evaluate the attenuation of a reflection is determined by the settings of the Reflect plug-in.
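
At run time, one way to enable early reflections for an emitter is to assign its early reflections Auxiliary Bus send from game code (the send can also be authored on the sound in Wwise). The following is a hedged sketch; the emitter ID and the "EarlyReflections" bus name are illustrative assumptions, and the bus is assumed to host the Reflect plug-in.

    // Illustrative emitter game object; the Auxiliary Bus named "EarlyReflections"
    // is assumed to exist in the project and to host the Reflect plug-in.
    static const AkGameObjectID FOOTSTEP_EMITTER = 300;

    void EnableEarlyReflections()
    {
        AkAuxBusID reflectBus = AK::SoundEngine::GetIDFromString("EarlyReflections");

        // Send this emitter's signal to the Reflect bus and set the send level.
        AK::SpatialAudio::SetEarlyReflectionsAuxSend(FOOTSTEP_EMITTER, reflectBus);
        AK::SpatialAudio::SetEarlyReflectionsVolume(FOOTSTEP_EMITTER, 1.0f);
    }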

Rooms and Portals

The rooms and portals system, while also used for diffraction calculations for the dry signal path, is the primary method to spatialize reverberation. Spatialized reverb includes the wet signal path and can generally be thought of as either reverb output from a plug-in, or a room tone playing directly on the Room Game Object.
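
Before looking at how distances are evaluated, the following hedged sketch shows how two rooms and a connecting portal might be registered from game code. All IDs, names, positions, and extents are illustrative assumptions, and the reverb Auxiliary Buses are assumed to exist in the project.

    // Illustrative room and portal IDs.
    static const AkRoomID   LIVING_ROOM(1);
    static const AkRoomID   HALLWAY(2);
    static const AkPortalID DOORWAY(10);

    void SetupRoomsAndPortal()
    {
        // A room has an orientation, a reverb Auxiliary Bus, and a transmission loss.
        AkRoomParams livingRoom;
        livingRoom.Front.X = 0.f; livingRoom.Front.Y = 0.f; livingRoom.Front.Z = 1.f;
        livingRoom.Up.X = 0.f;    livingRoom.Up.Y = 1.f;    livingRoom.Up.Z = 0.f;
        livingRoom.ReverbAuxBus = AK::SoundEngine::GetIDFromString("LivingRoomVerb"); // assumed bus
        livingRoom.TransmissionLoss = 0.8f; // attenuation applied to sound passing through the walls
        AK::SpatialAudio::SetRoom(LIVING_ROOM, livingRoom);

        AkRoomParams hallway = livingRoom;
        hallway.ReverbAuxBus = AK::SoundEngine::GetIDFromString("HallwayVerb"); // assumed bus
        AK::SpatialAudio::SetRoom(HALLWAY, hallway);

        // A portal connecting the two rooms through the doorway between them.
        AkPortalParams doorway;
        doorway.Transform.SetPosition(5.f, 0.f, 0.f);
        doorway.Transform.SetOrientation(0.f, 0.f, 1.f, 0.f, 1.f, 0.f);
        doorway.Extent.halfWidth  = 1.f;
        doorway.Extent.halfHeight = 2.f;
        doorway.Extent.halfDepth  = 0.5f;
        doorway.FrontRoom = LIVING_ROOM;
        doorway.BackRoom  = HALLWAY;
        doorway.bEnabled  = true; // an open door
        AK::SpatialAudio::SetPortal(DOORWAY, doorway);
    }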

  • When the Listener is inside a room that’s excited by reverb or a room tone:
    • The reverb and/or room tone is omnidirectional. The sound field of the room is counter-rotated such that it remains fixed to the room orientation regardless of the Listener’s orientation.
    • If the Distance Probe is also inside the room, the distance is evaluated as 0.
    • If the Distance Probe is outside the room, then the distance between the Distance Probe and the room extent bounding box is used to evaluate the attenuation curve. Refer to the following image demonstrating this scenario.
  • When the Listener is outside a room that’s excited by reverb or a room tone:
    • The sound from the room is spatialized according to the positions of the portals leading to the room.
    • Each path leading from a portal to the room also has an associated virtual position. The virtual position is calculated so that the contribution from the portal is panned to match the angle at which the path from the portal arrives at the Listener.
      • If the Distance Probe is also outside the room, the distance between the Distance Probe and the portal’s virtual position is used for attenuation evaluation.
      • If the Distance Probe is inside the room, then the distance to the portal’s virtual position is evaluated as 0.
    • For the transmission path of the room, the rules are the same as when the Listener is inside the room (except in this case, a transmission loss is applied to the sound).
      • If the Distance Probe is inside the room, the distance is evaluated as 0.
      • If the Distance Probe is outside the room, then the distance between the Distance Probe and the room extent bounding box is used to evaluate the attenuation curve (see the room-assignment sketch following this list).
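
Because these rules depend on which room the Listener and the Distance Probe are each in, both game objects need an up-to-date room assignment. The following is a hedged sketch, reusing the illustrative IDs from the previous examples and assuming the game determines room containment itself.

    // Keep both the Listener (camera) and the Distance Probe (character) assigned
    // to the rooms that currently contain them. The Listener's room selects between
    // the inside (omnidirectional) and outside (portal-spatialized) behaviors above;
    // the Distance Probe's room determines whether distances evaluate to 0 or are
    // measured from outside the room.
    void UpdateRoomAssignments(AkRoomID cameraRoom, AkRoomID characterRoom)
    {
        AK::SpatialAudio::SetGameObjectInRoom(CAMERA, cameraRoom);
        AK::SpatialAudio::SetGameObjectInRoom(PLAYER_CHARACTER, characterRoom);
    }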

An Example Scenario

The following annotated image of the Game Object 3D Viewer shows an example of a room with a single portal, where both the Distance Probe and the Listener are outside the room.

  • Distance: Distance Probe to portal’s virtual position. This distance is used to evaluate the contribution of the portal.
  • Distance: Distance Probe to room extent. This is the distance between the Distance Probe and the closest point to the Distance Probe on the surface of the bounding box defining the room extent*. This distance is used to render the transmission path.
  • Direction: Listener to portal’s virtual position. This direction is used to pan the contribution of the portal.
  • Direction: Listener to room extent. This direction is used to pan the contribution of the transmission path.


*Room extent visualization can be enabled in the Game Object 3D Viewer Settings by setting the Room Extent Opacity slider to a value greater than zero.

