There is a lot happening here at Audiokinetic, not just in development but in the extended community of Authors wielding Wwise in their development pipelines across the globe. There never seems to be a shortage of inspiring activity and incredible insights to be gained on the art of interactive audio, which is why we started the Wwise Up On Air series of livestreams: to bring interesting perspectives from different corners of the industry and surface some of the cool things being created by, and for, the community. With that intention, the format that would become Wwise Up On Air was piloted with the help of my first special guest, Audiokinetic Developer Thalie Keklikian. Thalie has experience as a streamer and made for a perfect co-conspirator as we began to formulate a strategy for delivering something interesting to the streaming platform. We discussed different approaches, dug into the tech, and planned for the first attempt. The initial stream in August 2019 turned out to be a ton of fun: we covered some news, kicked off the Community Spotlight section, folks showed up and engaged with questions, and moments of comedy ensued. While the episode fell off the Twitch timeline before I was able to highlight it, expect more collaborations with Thalie in the future.
With some initial learnings under our belt, and support during the podcast from the team at Audiokinetic, we were ready to schedule the September installment with someone local to the Audiokinetic Seattle office that I work out of. As a long-time Wwise user, freelance Sound Designer Andy Martin was a perfect fit for the improvisational, interview-style format that was emerging! Having known Andy for years, and at one point having lent a hand on one of his previous projects, I knew we could cover some interesting ground while hoping to bring something of value to the audience. With his background in AAA games and, more recently, VR and Location-Based Entertainment, we were off and running and quickly started brainstorming ideas. One idea was for Andy to take folks on a virtual “Sound Walk” through some of the most memorable environments recorded as part of his Northwest Soundscapes Project. Not only that, but he took us through the process of building a randomized bird-call system in Wwise that produced endless variations through the use of parameter-driven Switch, Blend, Sequence, and Random Containers.
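For anyone curious what “parameter-driven” looks like outside of the authoring tool, here is a minimal sketch, using the waapi-client Python package, of driving a Game Parameter and Event from a script while prototyping against a running Wwise session. The Play_Bird_Calls Event and Bird_Activity Game Parameter are assumptions for illustration, not objects from Andy's actual project.

```python
# Minimal sketch (not from the stream): remotely driving a parameter-controlled
# ambience while prototyping in the Wwise authoring tool.
# Assumes: pip install waapi-client, Wwise running with WAAPI enabled,
# and hypothetical Event / Game Parameter names as noted below.
import random
import time

from waapi import WaapiClient

with WaapiClient() as client:
    game_obj = 100  # arbitrary game object ID for prototyping
    client.call("ak.soundengine.registerGameObj",
                {"gameObject": game_obj, "name": "AmbiencePrototype"})

    # Hypothetical Event that plays the randomized bird-call containers.
    client.call("ak.soundengine.postEvent",
                {"event": "Play_Bird_Calls", "gameObject": game_obj})

    # Sweep a hypothetical Game Parameter mapped to the Switch/Blend Containers,
    # e.g. controlling the density or character of the bird calls.
    for _ in range(20):
        client.call("ak.soundengine.setRTPCValue",
                    {"rtpc": "Bird_Activity",
                     "value": random.uniform(0.0, 100.0),
                     "gameObject": game_obj})
        time.sleep(0.5)
```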
October saw the Wwise Tour landing in Seattle and, along with presentations from James Nixon on Crackdown 3 and Tencent on their newly integrated GME Voice Plugin, Kevin Bolen and Bill Rudolph from Skywalker Sound joined Audiokinetic Head of Product Simon Ashby at the AK Seattle office for a livestream roundtable. What ensued was a conversation that ran the gamut from a high-level understanding of spatialization techniques to musings on the nature and direction of authoring spatial audio into the future. Catching them fresh before their presentation later that night on the sound behind War Remains, Ralph Breaks the Internet, and Vader Immortal allowed for a longer-form discussion and engagement with folks who tuned in, and a deep dive into some of the decisions and thinking that go into the amazing work being done across the emerging XR platforms. Their presentation at the Wwise Tour brought an incredible perspective on the rich legacy of sonic storytelling at Skywalker Sound and showed a level of artistry across the titles they presented that makes it a must-watch in the archive.
In November I found myself onsite at the Audiokinetic Montreal office and eager to sit down with some folks during my time on the ground. First up was Audiokinetic Director of Wwise Experience Bernard Rodrigue, who joined me for the first installment of the “Hands On” series of Wwise Up On Air. The Hands On segments let us take a feature of Wwise and explore it in a casual, walkthrough-style overview that shows both its accessibility and its depth. Bernard provided an overview of the first steps to getting started with the Wwise Authoring API (WAAPI), an interface used to communicate with the Wwise authoring application. One thing covered was using an available WAAPI script to export a .wav file from a web-based audio synthesis tool directly into Wwise. Additionally, we explored automating key-map assignments for a folder of imported .wav files meant for use by MIDI files in the music system. This episode also attempted to illustrate how easy it is to get started with WAAPI and ways it could be worked into your development pipeline. We even walked through the process of connecting a keyboard controller and triggering samples using Control Surface Devices.
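If you want to try something similar at home, here is a minimal sketch, assuming the waapi-client Python package and a Wwise instance running with WAAPI enabled, of importing a .wav file through the ak.wwise.core.audio.import call. The file path and object path are placeholders, and this is not the exact script shown on the stream.

```python
# Minimal sketch: importing a .wav file into Wwise over WAAPI.
# Assumes: pip install waapi-client, Wwise running with WAAPI enabled,
# and placeholder file/object paths below.
from waapi import WaapiClient, CannotConnectToWaapiException

try:
    with WaapiClient() as client:
        result = client.call("ak.wwise.core.audio.import", {
            "importOperation": "useExisting",
            "default": {"importLanguage": "SFX"},
            "imports": [{
                # Placeholder paths: point these at your exported synth .wav
                # and the target location in your Actor-Mixer Hierarchy.
                "audioFile": "C:/temp/synth_export.wav",
                "objectPath": "\\Actor-Mixer Hierarchy\\Default Work Unit\\<Sound SFX>Synth_Export"
            }]
        })
        print("Import result:", result)
except CannotConnectToWaapiException:
    print("Could not connect to WAAPI. Is Wwise running with WAAPI enabled?")
```

From there, the same pattern extends to batch operations like assigning key maps across a folder of imported files.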
The following week I was able to sit down with local Montreal Sound Designer Beatrix Moersch to discuss her career and to invite folks tuned in to the stream to present her with a sound design challenge. We started off talking about her work in games, film, and VR as she unraveled a web of creative decisions that were empowered by a deep understanding of technical considerations in balance with creative vision. We followed up that conversation with the first-ever Wwise Sound Design Challenge! After polling the audience for potential sounds, Beatrix jumped into the content provided by the Wwise Audio Lab and the various plug-ins at her disposal and proceeded to design footstep sounds for a giant mech. Watching (and listening to) her process, starting from scratch to create something out of nothing, building a multi-layered sound out of a combination of Blend & Random Containers, Events, and Game Parameters, with a healthy dose of SoundSeed Grain and Convolution, was an enlightening experience. Watching her deftly navigate a foreign Wwise project towards the creation of a larger-than-life set of footsteps using only the tools available in Wwise was like unpacking the creative process in real time. Samples were mangled, parameters were employed, and grains of sound were warped to the whim of Beatrix's sonic vision. The session ended with my acting out the movement of the imaginary mech to a chorus of pitching solenoids and gigantic footstep impacts.
December brought another episode of the “Hands On” series of Wwise Up On Air, with Mads Maretty getting us on the road to development with the Unity-based Wwise Adventure Game (WAG). Mads led viewers through the process of opening WAG from the Wwise Launcher and then proceeded to peel back the layers of the Unity project while revealing some of the usability features implemented to make development easier. Options within the game menu that allow for quest completion, undamaged states, and teleportation between certification chapters and in-game locations provide an easy way to navigate the depths of the game project. We also drew direct connections between functionality across the Unity and Wwise projects and uncovered a sandbox and possibility-space that puts simple and advanced techniques for game audio well within reach. Surfacing the Wwise Project Adventure, Wwise Adventure Game, and Wwise Certifications is just the tip of the iceberg when it comes to the wealth of information available for gaining a greater understanding of Wwise and how game audio is integrated. It was great discussing these resources and bringing some of the functionality they provide to light.
We’ve fallen into a pattern on the Wwise Up On Air livestream of focusing on news about Wwise from the different channels we manage, including (to name a few) the Blog, Certifications, Community Events, Powered by Wwise, and our YouTube channel. The livestream also tries to pull together interesting, interactive-audio-related things from around the internet, like the insights, tutorials, and materials that Wwise Authors are surfacing across the web, and focus on them as part of the Wwise Community Spotlight. These range from videos on the specifics of Wwise and Unity to experimental plug-in suites available for download, and workflow optimizations and techniques that extend the functionality of Wwise in new and exciting ways. These passion- and project-driven tips, tricks, and resources are at the foundation of the interactive audio community, and I love learning about how people are extending their creativity with Wwise.
Finally, I want to take a moment to say “Thank You” to all of the folks I’ve spoken with, both on the livestream and over the course of the last year, as I’ve gotten settled into my role at Audiokinetic. An important part of the road ahead is these conversations with Wwise Authors: gathering insight, perspective, and feedback around the ways that Wwise helps empower creative interactive audio choices. It’s my goal to serve as a bridge between folks channeling the power of interactive audio and Audiokinetic as we build towards the future of authoring tools and an accessible way to realize your audio visions.
Tune in January 30th at 2pm EST for an exploration of the soundscapes in the game GRIS with special guest Rubén Rincón as we play through and unfold the Wwise project to get a clearer understanding of how its sound was designed and implemented. From the subtle beauty of well-designed content to the reinforcement of gameplay through interactive music, the sound of GRIS is a study in dynamic sound executed with Wwise, coming this January on Wwise Up On Air.