Custom scheduling of audio rendering

By default, the Wwise Sound Engine performs all of its command processing and audio rendering in a dedicated thread named AK::EventManager, whose properties are controlled by the AkPlatformInitSettings::threadLEngine settings. Calling AK::SoundEngine::RenderAudio signals the end of a game frame and allows the thread to consume all API commands issued since the previous call to RenderAudio.
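
For reference, here is a minimal sketch of the default setup, where RenderAudio only marks the end of the game frame and the dedicated thread does the actual work. The event name "Play_Footstep", the constant kPlayerObjectID, and the GameFrame function are hypothetical; the game object is assumed to have been registered elsewhere.

#include <AK/SoundEngine/Common/AkSoundEngine.h>

// Hypothetical game object ID, assumed to have been registered with
// AK::SoundEngine::RegisterGameObj during game initialization.
static const AkGameObjectID kPlayerObjectID = 100;

// Called once per game frame (default mode, LEngine thread enabled).
void GameFrame()
{
    // API commands are queued; they are not executed on this thread.
    AK::SoundEngine::PostEvent("Play_Footstep", kPlayerObjectID);

    // Signals the end of the game frame: the AK::EventManager thread
    // consumes all commands queued since the previous RenderAudio call.
    AK::SoundEngine::RenderAudio();
}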

Setting AkInitSettings::bUseLEngineThread to false disables this thread and causes RenderAudio to perform command processing synchronously, along with audio rendering when needed. The actual rate of audio output remains controlled by the audio endpoint. If the interval between RenderAudio calls is shorter than the buffer period determined by AkInitSettings::uNumSamplesPerFrame and the output sample rate, some calls to RenderAudio skip the audio rendering portion. Conversely, if the interval between RenderAudio calls is longer than the output buffer period, RenderAudio may process more than one buffer at a time, causing a CPU usage spike, and may eventually cause the audio to stutter.
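
A minimal sketch of this configuration is shown below. It assumes the other Wwise modules (memory manager, stream manager, and so on) are initialized as usual beforehand; InitSoundEngineWithoutLEngineThread and GameFrame are hypothetical names standing in for the game's own initialization and update loop.

#include <AK/SoundEngine/Common/AkSoundEngine.h>

// Initialize the sound engine without the LEngine thread.
bool InitSoundEngineWithoutLEngineThread()
{
    AkInitSettings initSettings;
    AkPlatformInitSettings platformSettings;
    AK::SoundEngine::GetDefaultInitSettings(initSettings);
    AK::SoundEngine::GetDefaultPlatformInitSettings(platformSettings);

    // Disable the AK::EventManager thread; RenderAudio now does the work.
    initSettings.bUseLEngineThread = false;

    return AK::SoundEngine::Init(&initSettings, &platformSettings) == AK_Success;
}

// Driven by the game's own scheduler, for example once per frame.
void GameFrame()
{
    // ... queue AK::SoundEngine API commands for this frame ...

    // Processes all queued commands on this thread, and renders an audio
    // buffer only when one is due according to the output buffer period.
    AK::SoundEngine::RenderAudio();
}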

When disabling the audio rendering thread, synchronous AK::SoundEngine::LoadBank and AK::SoundEngine::UnloadBank API calls must not be made from the same thread as the one calling RenderAudio: those calls can block until an audio buffer is rendered in order to complete Stop operations and release SoundBank media, which cannot happen without a concurrent call to RenderAudio.
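
One way to manage banks from the RenderAudio thread in this mode is to use the asynchronous LoadBank and UnloadBank overloads, which return immediately instead of blocking on the next rendered buffer. The sketch below assumes a recent SDK; the exact LoadBank overloads and the AkBankCallbackFunc signature vary between SDK versions (check AkCallback.h in your SDK), and "Weapons.bnk", OnBankLoaded, and RequestWeaponsBank are hypothetical names.

#include <AK/SoundEngine/Common/AkSoundEngine.h>
#include <AK/SoundEngine/Common/AkCallback.h>

// Bank load completion callback, invoked once the load finishes.
// (Signature shown here matches recent SDKs; older versions differ.)
static void OnBankLoaded(AkUInt32 in_bankID, const void* in_pInMemoryBankPtr,
                         AKRESULT in_eLoadResult, void* in_pCookie)
{
    // React to in_eLoadResult here, for example mark the bank as ready.
}

// Safe to call from the same thread that calls RenderAudio.
void RequestWeaponsBank()
{
    AkBankID bankID;
    // Asynchronous overload: queues the request and returns immediately,
    // so it cannot deadlock waiting for a buffer this thread must render.
    AK::SoundEngine::LoadBank("Weapons.bnk", &OnBankLoaded, nullptr, bankID);
}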

