My game has music tracks overlaid with MIDI so my systems can have "curated insight" into them. I have MIDI event callbacks working nicely, and it's been a blast! However, things fall a bit short of what I really want, which is for my systems to know about these events (mostly the notes) ahead of time so they can make better use of them as a dependency. Am I missing anything here?
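
For reference, my callback setup is roughly the following (trimmed down; "Play_Music" and HandleNoteOn are placeholders, and the struct/field names come from the SDK's AkCallback.h / AkMidiTypes.h, so they may differ slightly between Wwise versions):

#include <AK/SoundEngine/Common/AkSoundEngine.h>
#include <AK/SoundEngine/Common/AkCallback.h>

// Hypothetical game-side handler (placeholder for whatever reacts to the note).
void HandleNoteOn(int channel, int note, int velocity);

// Invoked by the sound engine for every MIDI event on the playing music track.
static void OnMidiEvent(AkCallbackType in_eType, AkCallbackInfo* in_pInfo)
{
    if (in_eType != AK_MIDIEvent)
        return;

    AkMIDIEventCallbackInfo* pMidi = static_cast<AkMIDIEventCallbackInfo*>(in_pInfo);
    const AkMIDIEvent& ev = pMidi->midiEvent;

    if (ev.byType == AK_MIDI_EVENT_TYPE_NOTE_ON)
    {
        // Channel, note number, and velocity are all available here,
        // but only at the moment the note actually plays.
        HandleNoteOn(ev.byChan, ev.NoteOnOff.byNote, ev.NoteOnOff.byVelocity);
    }
}

void StartMusic(AkGameObjectID gameObjectID)
{
    // AK_MIDIEvent requests MIDI event callbacks for this playing Event.
    AK::SoundEngine::PostEvent("Play_Music", gameObjectID, AK_MIDIEvent, &OnMidiEvent, nullptr);
}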

For example, I may want a specific MIDI note to trigger a gameplay event that has an animation leading up to it. If I can only react to MIDI events "live", that's hard to achieve. Or maybe I'd like the duration of the note to be significant in some way, again without that "live" constraint. I know the MIDI spec doesn't store duration directly (it's implied by the Note On/Note Off pair), but we're already parsing these notes from files and showing them in the Wwise editor, so it's certainly under the hood somewhere! I briefly considered non-MIDI eventing to help with this, but I want to use velocity, frequency, channel, program changes, etc. pretty much everywhere, so nothing else really makes sense.
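
To spell out the duration point: since the MIDI stream only carries Note On and Note Off, a note's length is just the gap between the two. With live callbacks I can pair them up myself, along the lines of the sketch below, but the duration is only known once the note has already ended, which is exactly the constraint I'm trying to escape. (Plain C++, nothing Wwise-specific; NoteTracker is just my own illustration.)

#include <chrono>
#include <cmath>
#include <unordered_map>

// Equal-tempered frequency for a MIDI note number (A4 = note 69 = 440 Hz).
double NoteToFrequency(int note)
{
    return 440.0 * std::pow(2.0, (note - 69) / 12.0);
}

// Duration can only be measured after the fact by pairing a Note On with the
// matching Note Off on the same channel/note.
struct NoteTracker
{
    using Clock = std::chrono::steady_clock;
    std::unordered_map<int, Clock::time_point> active; // key = channel * 128 + note

    void OnNoteOn(int channel, int note)
    {
        active[channel * 128 + note] = Clock::now();
    }

    // Returns the note length in milliseconds, or -1 if we never saw the Note On.
    double OnNoteOff(int channel, int note)
    {
        auto it = active.find(channel * 128 + note);
        if (it == active.end())
            return -1.0;
        double ms = std::chrono::duration<double, std::milli>(Clock::now() - it->second).count();
        active.erase(it);
        return ms;
    }
};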

For now, I've unblocked myself with some hacky stuff: I play a duplicated track one measure ahead of time and use it as a "future event buffer". Since I know the duration of a measure thanks to the music segment info, I know when the "real" notes will happen. I don't believe this will be sustainable as more complex interactive music content is implemented, and it defeats the purpose of using Wwise in many ways anyway.
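
In case it helps to see it concretely, the hack is roughly this (a sketch, not my exact code; fBarDuration is the bar-length field of AkSegmentInfo as I understand it in current SDK versions, and OnLookaheadNoteOn / ConsumeDueNotes are my own glue):

#include <AK/MusicEngine/Common/AkMusicEngine.h>
#include <deque>

struct FutureNote
{
    double fireTimeMs;   // when the "real" note will sound on the audible track
    int    channel;
    int    note;
    int    velocity;
};

std::deque<FutureNote> g_futureNotes;   // the "future event buffer"

// Called from the MIDI callback of the duplicated track playing one bar early.
void OnLookaheadNoteOn(AkPlayingID musicPlayingID, double nowMs,
                       int channel, int note, int velocity)
{
    AkSegmentInfo segInfo;
    if (AK::MusicEngine::GetPlayingSegmentInfo(musicPlayingID, segInfo) == AK_Success)
    {
        // The real note on the audible track arrives one bar later.
        g_futureNotes.push_back({ nowMs + segInfo.fBarDuration, channel, note, velocity });
    }
}

// Game systems poll this each frame to find notes that are about to happen.
void ConsumeDueNotes(double nowMs, double lookAheadMs)
{
    while (!g_futureNotes.empty() && g_futureNotes.front().fireTimeMs - nowMs <= lookAheadMs)
    {
        // ... kick off the wind-up animation, gameplay event, etc. ...
        g_futureNotes.pop_front();
    }
}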

Here's to hoping I just missed something! Otherwise, I think this is a great case for a feature request. Thanks for reading!
in General Discussion by Bahamuto (100 points)
edited by Bahamuto
