
Audiokinetic's Community Q&A is the forum where Wwise and Strata users ask and answer questions within the community. If you would like to get an answer from Audiokinetic's Technical Support team, please use the Support Tickets page.

0 votes
I need to play a sound in stereo when the listener (player) and the sound emitter are both in the same area (for example, in one room), and play it in mono when the emitter and listener are in different areas separated by geometry (for example, when the listener hears the sound occluded by a wall). Is there any way to do this automatically for all occluded sounds?
I'm using Unreal Engine 5.1 and Wwise 2022.1.2.8150.
in General Discussion, by Anton Z. (150 points)

1 Answer

–1 vote
 
Best answer
Hey Anton.

The quick answer is no.

When you convert your original sounds into game-ready sound files, the Conversion ShareSet defines the channel configuration, audio codec, and so on. If you've set it to stereo (or it's already a stereo file and you left the channels on "As Input"), the sound will play back with two channels in-game, and you can use the Spread curve in your Attenuation ShareSet to spread the channels apart when close to the emitter. When no spread is applied, both channels are played from exactly the same position, so the source is perceived as mono, but any filtering, effects, etc. are applied twice, once per channel.

That said, you could probably play both a 2-channel stereo sound and a 1-channel mono sound on the same game object, use the new Spatial Audio attenuation curves to control which one is audible, and let the virtual voice system decide which sound actually gets processed. A rough sketch of the game-side half of that idea is below.
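For illustration only, here is a minimal C++ sketch, assuming the Wwise sound engine is already initialized and that you have authored two Events (the names "Play_MySound_Stereo" and "Play_MySound_Mono" are hypothetical placeholders) whose attenuation / Spatial Audio curves and virtual voice behavior are set up in the Wwise Authoring tool so that only one of them is audible at a time:

```cpp
// Hypothetical sketch: post a stereo and a mono version of the same sound
// on one game object and let the curves / virtual voice settings authored
// in Wwise decide which one is actually heard. Event names and the game
// object ID below are placeholders, not real project assets.
#include <AK/SoundEngine/Common/AkSoundEngine.h>

static const AkGameObjectID MY_EMITTER = 100; // any unique, app-chosen ID

void PlayStereoAndMonoVariants()
{
    // Register the emitter game object (normally done once when it spawns).
    AK::SoundEngine::RegisterGameObj(MY_EMITTER, "MyEmitter");

    // Post both Events on the same game object. Which voice is audible
    // (and which goes virtual) is driven entirely by the attenuation /
    // Spatial Audio curves and virtual voice settings authored in Wwise.
    AK::SoundEngine::PostEvent("Play_MySound_Stereo", MY_EMITTER);
    AK::SoundEngine::PostEvent("Play_MySound_Mono", MY_EMITTER);
}
```

In the Unreal integration you would typically post the two corresponding AkAudioEvent assets on the same AkComponent rather than calling the SDK directly, but the idea is the same: both voices live on one emitter, and Wwise picks which one is heard.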
by Mads Maretty S. (Audiokinetic) (40.2k points)
edited by Mads Maretty (Audiokinetic)
...