Audiokinetic's Community Q&A is the forum where users can ask and answer questions within the Wwise and Strata communities. If you would like to get an answer from Audiokinetic's Technical support team, make sure you use the Support Tickets page.

0 votes
Hi,

We have been using Unreal Engine 4's sound engine.  When making a SOUND CUE (the equivalent of a WWISE AUDIO EVENT), we would attach a Local/Remote Node to our sound effects.  This let us have two different attenuation settings, one for the local player and one for the remote player.  The local player's sound would be 2D and the remote player's would attenuate in 3D.  Is there a way to do something similar in WWISE, where one AUDIO EVENT could play back with both local and remote volume settings, and the game could recognize this and switch automatically?


Thank you,

Marc Mailand
in General Discussion by Nikki M. (140 points)

1 Answer

0 votes
Best answer
You should be able to do this with Switch Containers to differentiate between first- and third-person sounds. At least that's how I am currently doing it. You could then use SetSwitch to set a game object to first person (local player) or third person (remote player). You may also want to always use first-person 2D sounds for your player character; that's personal preference.

Edit: As Mathieu from AK commented on another similar question: you could also link the POV RTPC to the 2D/3D parameter in the positioning tab
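To make the SetSwitch approach concrete, here is a minimal sketch of how the game-side call might look with the Wwise SDK. The switch group name "POV", the state names "Local"/"Remote", and the event name "Play_Footstep" are placeholder assumptions standing in for whatever you author in your Wwise project:

```cpp
// Hedged sketch, not a definitive implementation. Assumes a Switch
// Container in the Wwise project keyed on a "POV" switch group with
// "Local" (2D) and "Remote" (3D attenuated) states.
#include <AK/SoundEngine/Common/AkSoundEngine.h>

void PostFootstep(AkGameObjectID emitter, bool isLocalPlayer)
{
    // Set the POV switch on the emitting game object so the Switch
    // Container picks the local or remote branch. (Group/state names
    // here are assumptions; use whatever is authored in your project.)
    AK::SoundEngine::SetSwitch("POV", isLocalPlayer ? "Local" : "Remote", emitter);

    // A single event; the Switch Container routes it to the branch
    // with the matching positioning and attenuation settings.
    AK::SoundEngine::PostEvent("Play_Footstep", emitter);
}
```

Because the switch is set per game object, each emitter can independently resolve to the local or remote branch of the same event.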
by Richard Goulet (5.8k points)
edited by Richard Goulet