
0 votes
I need to play a sound in stereo when the listener (the player) and the sound emitter are in the same area (for example, the same room), and in mono when the emitter and listener are in different areas separated by geometry (for example, when the listener hears the sound occluded by a wall). Is there any way to do this automatically for all occluded sounds?
I'm using Unreal Engine 5.1 and Wwise 2022.1.2.8150.
in General Discussion by Anton Z. (150 points)

1 Answer

–1 vote
 
Best answer
Hey Anton.

The quick answer is no.

When you convert your original sounds into game-ready sound files, the Conversion ShareSet defines the channel configuration, audio codec, and so on. If you've set it to stereo (or it's already a stereo file and you've set the conversion to "As Input"), the sound will play with two channels in-game, and you can use your Attenuation ShareSet's Spread curve to spread the channels apart when close to the emitter. When no spread is applied, both channels play from exactly the same position, so the source is perceived as mono, but any filtering, effects, etc. are still applied twice - once per channel.

That said, you could probably play both a 2-channel stereo sound and a 1-channel mono sound on the same game object, use the new Spatial Audio attenuation curves to control which one is audible, and let the virtual voice system make sure only that one actually gets processed.
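
Roughly something like this, as a minimal sketch using the Wwise SDK directly (the Unreal integration's AkComponent would do the same through PostAkEvent). The event names and game object ID below are just placeholders; the actual switching between the two variants happens in the attenuation/occlusion curves and virtual voice settings you author in your Wwise project, not in code:

```cpp
// Sketch only: post a stereo and a mono variant of the same sound on one game object.
// "Play_MySound_Stereo" / "Play_MySound_Mono" are hypothetical event names - in the
// Wwise project, the stereo variant's occlusion-driven curve would fade it to silence
// when occluded, the mono variant's would do the opposite, and both would be set to
// go virtual below the volume threshold so only the audible one is processed.
#include <AK/SoundEngine/Common/AkSoundEngine.h>

static const AkGameObjectID kEmitterID = 100; // any unique ID your game assigns

void PlayOccludableSound()
{
    // Register the emitter once (the Unreal integration normally does this for you).
    AK::SoundEngine::RegisterGameObj(kEmitterID, "OccludableEmitter");

    // Post both variants on the same game object; which one you hear is decided
    // entirely by the curves and virtual voice behavior authored in Wwise.
    AK::SoundEngine::PostEvent("Play_MySound_Stereo", kEmitterID);
    AK::SoundEngine::PostEvent("Play_MySound_Mono", kEmitterID);
}
```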
by Mads Maretty S. (Audiokinetic) (39.8k points)
edited by Mads Maretty S.