
Playing an iconic puzzle game to the beat

Game Audio

More than a year ago, when I was called to work on a new game, I didn't expect it to be for the well-known franchise Tetris®. My colleague at Amber, Rodrigo Ferzuli, suggested using Wwise to create several interactive systems that would emphasize the concept of this new game, a fusion of rhythm games and Tetris. Here's how Tetris Beat works: it asks for clever placement of tetromino-shaped puzzle pieces to clear complete lines of blocks (is there anybody who hasn't played it yet?), with the added challenge of making every move "on the beat" while missing as few opportunities as possible to score combos. Ideally, the player enters a flow state while being carried along by modern music and catchy songs.

 

Pieces Start Falling into Place

One of the challenges was building a musical grid to determine the accuracy of every input and to drive animation events musically. The task was consuming too many development resources, but everything started falling into place once we integrated Wwise.

First, a series of Callbacks is sent from Wwise and caught by a script. These are then compared against the user's input to determine whether a tetromino move was made on the beat. This is done using the following AkCallbackTypes: AkCallbackType.AK_MusicSyncBeat, AkCallbackType.AK_MusicSyncBar and AkCallbackType.AK_EnableGetMusicPlayPosition. It's important to note that there's a window of time during which a user's input can still be considered "not great, but close enough to the grid". If the input is acceptable, a positive sound effect or a musical stinger that harmonizes with the current song is played; otherwise, it triggers a negative state and the corresponding "not rewarding" SFX.
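A minimal Unity-side sketch of this pattern could look like the following, assuming the standard Wwise Unity integration. The Event reference, the tolerance window, and the AkMusicSyncCallbackInfo field used here are illustrative and should be verified against your installed integration version; this is not the game's actual code.

```csharp
using UnityEngine;

// Sketch of a beat grid fed by Wwise music callbacks.
// Names like "musicEvent" and the 120 ms tolerance are illustrative.
public class BeatGrid : MonoBehaviour
{
    public AK.Wwise.Event musicEvent;       // interactive music Event, assigned in the Inspector
    public float toleranceSeconds = 0.12f;  // "close enough to the grid" window (illustrative)

    double lastBeatDspTime = double.NegativeInfinity;
    float beatDuration = 0.5f;

    void Start()
    {
        // Ask Wwise to send beat/bar sync callbacks and enable play-position queries.
        musicEvent.Post(gameObject,
            (uint)(AkCallbackType.AK_MusicSyncBeat |
                   AkCallbackType.AK_MusicSyncBar |
                   AkCallbackType.AK_EnableGetMusicPlayPosition),
            OnMusicCallback);
    }

    void OnMusicCallback(object cookie, AkCallbackType type, AkCallbackInfo info)
    {
        if (type == AkCallbackType.AK_MusicSyncBeat && info is AkMusicSyncCallbackInfo musicInfo)
        {
            // Timestamp the beat and remember its duration so inputs can be
            // judged against the musical grid.
            lastBeatDspTime = AudioSettings.dspTime;
            beatDuration = musicInfo.segmentInfo_fBeatDuration;
        }
    }

    // Called by gameplay code when the player moves a tetromino.
    public bool IsOnBeat()
    {
        if (double.IsNegativeInfinity(lastBeatDspTime))
            return false; // no beat received yet

        double sinceLastBeat = AudioSettings.dspTime - lastBeatDspTime;
        double untilNextBeat = beatDuration - sinceLastBeat;
        return sinceLastBeat <= toleranceSeconds || untilNextBeat <= toleranceSeconds;
    }
}
```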

VFX Responsiveness

For each level of our beloved game, there is a soundtrack curated from a list of several artists and tempos. Many visual aids help users follow the beat, most of them driven by the basic beat-sync Callback. For example, the Matrix (the space on which the game is played) glows with varying intensity, marking the beat like a metronome. Other rhythmic visual elements dance on the stage in the background, reacting to audio frequencies. Along with the rest of the audio team, I looked for ways to send information back to the Unity game engine so technical artists could hook into an RTPC value that would work dynamically for each song in real time. This saved them from hand-animating hours of visuals synchronized to the music, and made it possible to meet tight deadlines when new levels were added.

We also used the Wwise Meter plugin. It works as an envelope follower and lets its gain analysis be assigned to an RTPC. On every song, we then routed some Aux channels to a “no output” bus so that the routing wouldn't double the volume; each of these Aux channels has LP, HP or Bandpass filters to separate frequency bands, the way a crossover or multiband processor limits the frequency range captured by each RTPC. In some cases we also added single “ghost” tracks to isolate elements like vocals, bass or drums that were important to represent visually.
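For context, here is roughly what the Unity side of that hookup can look like: a script polls the Meter-driven RTPC each frame so technical artists can bind the value to whatever visual they like. The RTPC name is a made-up example, and the GetRTPCValue call and AkQueryRTPCValue enum are taken from the standard Wwise Unity integration; check the exact signatures against your version.

```csharp
using UnityEngine;

// Sketch: expose a Wwise Meter-driven RTPC to VFX scripts.
// "Meter_Bass" is a hypothetical RTPC name; a real project would have one per band or "ghost" track.
public class MeterRtpcReader : MonoBehaviour
{
    public string rtpcName = "Meter_Bass";

    // Latest value of the RTPC (the Meter is calibrated in Wwise to output 0-100).
    public float Value { get; private set; }

    void Update()
    {
        int valueType = (int)AkQueryRTPCValue.RTPCValue_Global; // query the global RTPC scope
        float value;
        if (AkSoundEngine.GetRTPCValue(rtpcName, null, 0, out value, ref valueType) == AKRESULT.AK_Success)
            Value = value;
    }
}
```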

[Image] Tetris ® & © 1985~2022 Tetris Holding.

Working with isolated “ghost” tracks allowed us to exaggerate their attack so they'd be more responsive, and even to apply volume envelopes and LFOs over a white-noise track to design buildup ramps, pulsating breaks and rhythmic phrases, and to make the meters move in the desired way. The values sent to the designated RTPCs can be calibrated inside the Meter properties to limit the dB range and send only useful data to the VFX artists. Make-up gain can also be applied to these “ghost” tracks before they're delivered to the Unity game engine, so developers or technical artists get a consistent 0 to 100 value whenever the given frequency range or instrument's amplitude envelope is playing.
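Purely as an illustration of the receiving end (in the game itself, the shaping described above happens on the audio side in Wwise), a technical artist could map that calibrated 0-100 value onto something like a light intensity, optionally adding a fast-attack/slow-release smoother of their own. The component below is a sketch with made-up constants, building on the reader script from the previous example.

```csharp
using UnityEngine;

// Illustrative only: drive a light from a 0-100 meter value delivered via RTPC.
// Attack/release constants are invented for the example.
public class BeatGlow : MonoBehaviour
{
    public MeterRtpcReader meter;        // reader from the previous sketch
    public Light glowLight;
    public float maxIntensity = 4f;
    public float attackPerSecond = 30f;  // rise quickly on transients
    public float releasePerSecond = 5f;  // fall back slowly

    float smoothed;

    void Update()
    {
        float target = Mathf.Clamp01(meter.Value / 100f);
        float rate = target > smoothed ? attackPerSecond : releasePerSecond;
        smoothed = Mathf.MoveTowards(smoothed, target, rate * Time.deltaTime);
        glowLight.intensity = smoothed * maxIntensity;
    }
}
```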

[Image] Tetris ® & © 1985~2022 Tetris Holding.

Spatial Audio

With the arrival of new headphone products equipped with accelerometers and binaural renderers for multi-channel content, we were presented with the possibility of experiencing the game as if we were seated in the center of the action, with the game's Matrix board in front of us and sound coming from all around.

Another possibility was to fight the rigidity of a static player contemplating a fixed game area. Engineers figured out how to hook into the accelerometer data so that head orientation values could be translated into Wwise listener rotation. When the corresponding option is activated in iOS, the audio content "moves" in real time as the user turns their head, panning the audio so that the sound seems to stay on the screen even when looking to either side. This also gives players the ability to "follow" any panned sound that happens in the surround field. The usual attenuation parameters, like filters, spread and volume, are controlled with Wwise.
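In Unity terms, the idea can be sketched like this: some native bridge (assumed here, not shown) delivers the headphone attitude as a quaternion, for example from iOS's CMHeadphoneMotionManager, and a script applies it to the transform carrying the AkAudioListener so that the Wwise integration forwards the rotation to the listener.

```csharp
using UnityEngine;

// Sketch: rotate the Wwise listener with the user's head.
// A native bridge (not shown) is assumed to write the headphone attitude
// into `headAttitude` each frame, e.g. from iOS CMHeadphoneMotionManager.
[RequireComponent(typeof(AkAudioListener))]
public class HeadTrackedListener : MonoBehaviour
{
    [HideInInspector] public Quaternion headAttitude = Quaternion.identity;

    Quaternion baseRotation;

    void Start()
    {
        baseRotation = transform.rotation;
    }

    void Update()
    {
        // The AkAudioListener on this GameObject forwards the transform's
        // position/orientation to Wwise, so rotating the transform rotates
        // the listener and re-pans the mix in real time.
        transform.rotation = baseRotation * headAttitude;
    }
}
```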

To enhance the audio content, we prepared special stems to play in the rear field whenever a surround system, soundbar or simulated headphone surround is present. Some tracks, generally those whose musical content already moves around the stereo field, naturally translate well to the rear channels. Other song elements, like transitions, can be automated to create a nice circular feeling in a surround environment. The rest of the work is performed by the binaural rendering included on users' devices, thanks to the "Spatial Audio" feature built into iOS.

Hard Drop Stingers

Playing the same "positive" sound every time a user scores on beat was not ambitious enough for a game with such a diverse range of music from artists around the world, so we commissioned a set of special sounds that went well with each song. For this, we used Wwise Containers holding a variety of musical phrases tuned to a certain scale, percussion sounds or elements from the song itself; these were assigned to play during the same action but switched via Wwise States at specific song segments. Sometimes you can hear a sequence of sounds playing a melody, or a series of percussion hits, as you complete a Combo, so users feel like they're filling in the musical arrangement at the same time they're scoring.
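A simplified version of that hookup might look like the sketch below, where the State group, State names and Event name are placeholders: gameplay sets the current song segment as a Wwise State, and a single stinger Event is posted on every on-beat hard drop, letting the Container in Wwise pick content appropriate to that State.

```csharp
using UnityEngine;

// Sketch: one stinger Event whose content is swapped per song segment via a Wwise State.
// "Song_Section" and the Event reference are placeholder names.
public class HardDropStingers : MonoBehaviour
{
    public AK.Wwise.Event hardDropStinger;   // resolves to a Container in Wwise

    // Called when the music system reaches a new segment
    // (e.g. from a MusicSyncBar or user-cue callback).
    public void OnSongSegmentChanged(string segmentName)
    {
        // The Container switches which children it plays based on this State.
        AkSoundEngine.SetState("Song_Section", segmentName);
    }

    // Called when the player lands a hard drop on the beat.
    public void OnHardDropOnBeat()
    {
        hardDropStinger.Post(gameObject);
    }
}
```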


Real Time Effects and Parameter Controls (“Fever Time”)

“Fever Time” works as a climax for the songs and is triggered every time the user collects a number of correct inputs. To the player, this is perceived as “momentum”, similar to when a superstar DJ cuts the low frequencies on the mixer and starts creating space with effects and rhythmic aids. We delivered this with an Auxiliary send feeding audio from the music track into a chain of Effects; as the send volume rises, the visuals heighten the experience for the user. When this bonus time is over and gameplay returns to its normal mode, there is the feeling of a “music drop” as the audio plays again at its full bandwidth and the “spacey” FX returns to a dry setting.
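One way such a transition could be driven from Unity, sketched here with a hypothetical "Fever_Amount" RTPC that is mapped inside Wwise to the Auxiliary send level (and, if desired, to the Effect parameters), is to ramp a global RTPC up when Fever Time starts and quickly back down when it ends; the names and ramp times are placeholders.

```csharp
using UnityEngine;
using System.Collections;

// Sketch: ramp a global RTPC that, inside Wwise, raises the Auxiliary send into
// the "spacey" effect chain and lowers it again afterwards.
// "Fever_Amount" and the ramp durations are placeholder values.
public class FeverTime : MonoBehaviour
{
    public string rtpcName = "Fever_Amount";
    public float rampUpSeconds = 1.0f;
    public float rampDownSeconds = 0.25f;   // short ramp back for the "music drop" feel

    public void StartFever() { StopAllCoroutines(); StartCoroutine(Ramp(0f, 100f, rampUpSeconds)); }
    public void EndFever()   { StopAllCoroutines(); StartCoroutine(Ramp(100f, 0f, rampDownSeconds)); }

    IEnumerator Ramp(float from, float to, float duration)
    {
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            AkSoundEngine.SetRTPCValue(rtpcName, Mathf.Lerp(from, to, t / duration));
            yield return null;
        }
        AkSoundEngine.SetRTPCValue(rtpcName, to);
    }
}
```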

Dynamic Hard Drops Sound Procedure


Tetris® Beat is now available on Apple Arcade: https://apps.apple.com/us/app/tetris-beat/id1536485727


RODRIGO FERZULI

Rodrigo is an Audio Engineer from Mexico City, currently working at Amber - Game Development & Studio Services Agency, where he has developed game audio for projects like Tetris® Beat, among others. He also serves as an official IGDA Audio Mentor and is the Co-Founder of the Game Audio Latam Community. As a result of these efforts, Rodrigo earned a place in the 2021 TGA Future Class.

Uriel Orozco

Uriel works as Audio Lead at Amber Video Games for the Mexico area. He has around 15 years of experience working at recording studios in several roles, including recording engineer, mixer, producer and arranger, and has a special affinity for synthesizers and sound-processing techniques. As a musician, he has composed tracks for traditional media like movies and TV and has toured with several projects in Mexico and the United States.
