
How to Use Interactive Music to Score Gameplay


During our recent Interactive Music Symposium, Joe Thwaites dove into how his team used interactive music systems in Sackboy: A Big Adventure to help elevate the game’s score. Below, you’ll find a summary of key points from his talk.

Expanding the Core DNA

"The goal of the music is always to support the story, and interactive techniques are really great tools for helping us achieve this goal," says Joe.  When creating Sackboy: A Big Adventure, the audio team decided to build on the musical legacy of the LittleBigPlanet titles, specifically the combination of licensed tracks and original pieces. Only this time, they wanted both licensed and original pieces to feel "equally as embedded and interactive." To them, this meant thoughtfully making use of interactive techniques which supported the narrative of the game, such as horizontal resequencing, vertical layering, stingers & embellishments, and runtime processing effects.

Let’s dive into how Joe’s team used these basic systems in Sackboy: A Big Adventure.

Horizontal Resequencing

Horizontal resequencing is an interactive music technique where music is dynamically pieced together based on a player’s actions. For Joe's team, this meant splitting each track into loopable chunks, and it allowed the music to move to a new section or repeat a previous section depending on what the player did. They used this technique whenever they wanted to:

• Help create momentum

• Change the mood

• Add a cadence to the music

• Change seamlessly to a new piece of music

Let's look at a few examples:
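The talk illustrates these with gameplay footage; the article doesn't detail the implementation, but as a rough model of the idea, here is a minimal sketch in Python. The section names, lengths, and checkpoint logic are hypothetical, not taken from Sackboy: the track is split into loopable chunks, and at each musical boundary the sequencer either repeats the current chunk or advances to the next one.

```python
# Illustrative sketch of horizontal resequencing (not the Sackboy implementation):
# loopable sections are chained together, and progress through the level decides
# whether the music keeps looping or moves forward.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Section:
    name: str
    bars: int                    # length of the loopable chunk, in bars
    next_section: Optional[str]  # where the music goes once progress allows it

# Hypothetical layout of one level's track.
SECTIONS = {
    "intro":       Section("intro", 8, "exploration"),
    "exploration": Section("exploration", 16, "climb"),
    "climb":       Section("climb", 8, "finale"),
    "finale":      Section("finale", 16, None),
}

def next_section(current: str, reached_checkpoint: bool) -> str:
    """Called at a musical boundary (e.g. the end of the current chunk)."""
    section = SECTIONS[current]
    if reached_checkpoint and section.next_section:
        return section.next_section   # advance: creates momentum, changes the mood
    return current                    # otherwise keep looping the same chunk

# Example: the music loops "exploration" until the player reaches a checkpoint.
playhead = "exploration"
for checkpoint_hit in (False, False, True):
    playhead = next_section(playhead, checkpoint_hit)
    print(playhead)   # exploration, exploration, climb
```

Because the decision is only taken at a musical boundary, the transition always lands cleanly, whether the music repeats a section, cadences, or moves on to a new piece.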

Vertical Layering

Vertical layering is an interactive music technique in which layers of music can be added or removed to:

• Add variation

• Change the mood

• Build or release tension

• Support narrative development

Joe’s team split each track on a case-by-case basis, but they found themselves generally splitting the music into the following layers: drums, bass, lead, accompaniment, and vocals. Depending on gameplay events, they would add or remove layers to help elevate moments that supported the narrative. Let's have a look at some examples of this:
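As a conceptual sketch of that layering approach, one simple way to drive it is to fade layers in one at a time as a gameplay intensity value rises. The 0-to-1 intensity value, the layer ordering, and the thresholds below are assumptions for illustration, not the team's actual setup.

```python
# Illustrative sketch of vertical layering: all stems play in sync, and a
# gameplay "intensity" value decides which layers are audible.

from typing import Dict

# Stems as described in the talk; the ordering and thresholds are assumptions.
LAYERS = ["drums", "bass", "accompaniment", "lead", "vocals"]

def layer_gains(intensity: float) -> Dict[str, float]:
    """Map a 0..1 gameplay intensity to per-layer gains.
    Layers fade in one at a time as intensity rises, building tension."""
    gains = {}
    for i, layer in enumerate(LAYERS):
        threshold = i / len(LAYERS)   # each layer starts fading in at a higher intensity
        gains[layer] = max(0.0, min(1.0, (intensity - threshold) * len(LAYERS)))
    return gains

# Example: quiet exploration vs. the build-up into a boss fight.
print(layer_gains(0.15))   # only the drums are (partly) audible
print(layer_gains(0.95))   # full mix, with the vocals fading in last
```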

Stingers & Embellishments

Stingers & embellishments are short musical elements triggered by gameplay events. "These are great for highlighting specific moments of gameplay instantly, and giving feedback to the player when they do certain actions or collect certain things," says Joe. His team used these in Sackboy to:

• Reward the player

• Support gameplay mechanics

• Smooth transitions

Here are a few examples of this:
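One way to picture the mechanism is a mapping from gameplay events to short musical phrases, quantized to the next beat so the reward lands in time with the underlying track. The event names and stinger names below are hypothetical; this is an illustrative sketch, not the team's implementation.

```python
# Illustrative sketch of stingers & embellishments: gameplay events trigger short
# phrases, quantized to the beat so they sit musically on top of the score.

import math
from typing import Tuple

# Hypothetical event-to-stinger mapping.
STINGERS = {
    "collectable_picked_up": "stinger_sparkle",
    "checkpoint_reached":    "stinger_chime",
    "goal_reached":          "stinger_fanfare",
}

def schedule_stinger(event: str, song_position_beats: float) -> Tuple[str, int]:
    """Pick the stinger for a gameplay event and quantize it to the next beat,
    so the feedback to the player stays in time with the music."""
    stinger = STINGERS[event]
    trigger_beat = math.ceil(song_position_beats)
    return stinger, trigger_beat

# Example: the player grabs a collectable mid-beat; the phrase waits for beat 33.
print(schedule_stinger("collectable_picked_up", 32.4))   # ('stinger_sparkle', 33)
```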

Runtime Processing

In addition to editing the music itself, Joe’s team applied filters and effects at runtime that manipulated the music depending on gameplay. They did this to:

• Embed the music into the game world

• React to gameplay states

In the following video, you'll hear some of these runtime processing effects:
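The video shows the real thing; as a rough conceptual model of state-driven processing, here is a minimal sketch. The states, cutoff values, and smoothing rate are assumptions for illustration, not details from the game.

```python
# Illustrative sketch of runtime processing: rather than editing the music,
# an effect parameter (here a low-pass cutoff) is driven by the gameplay state,
# e.g. muffling the mix while the player is underwater or the game is paused.

# Hypothetical mapping from gameplay state to a low-pass cutoff (Hz).
STATE_CUTOFF_HZ = {
    "default":    20000.0,   # full-range mix
    "underwater":   900.0,   # heavily muffled: the music sits "inside" the world
    "paused":      2500.0,   # softened, pushed behind the pause menu
}

def smoothed_cutoff(current_hz: float, game_state: str, dt: float, rate: float = 4.0) -> float:
    """Move the filter cutoff toward the target for the current state.
    Smoothing over time avoids an audible jump when the state changes."""
    target = STATE_CUTOFF_HZ.get(game_state, STATE_CUTOFF_HZ["default"])
    return current_hz + (target - current_hz) * min(1.0, rate * dt)

# Example: the player dives underwater; over a few frames the cutoff glides down.
cutoff = 20000.0
for _ in range(5):
    cutoff = smoothed_cutoff(cutoff, "underwater", dt=1 / 60)
    print(round(cutoff))
```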


Conclusion: Scoring Gameplay to Support the Narrative

Before exploring interactive music techniques, it's vital to consider how those techniques will support the narrative of the game. For Sackboy, Joe's team wanted the music to support momentum through the level, so they decided to use horizontal resequencing to dynamically change the music as the player progressed. They wanted the music to build up and into narrative moments and boss fights, so they decided to use vertical layering to help build tension and drama. They wanted the music to reward the player when they made progress towards the goal, so they decided to use stingers and musical embellishments to give feedback. And because they wanted the music to feel embedded in the game world, they decided to use runtime processing effects to react to gameplay states. "In a game like this," says Joe, "there's so much that has potential to feed the music system, it is important for us to be guided by the journey of the player within the context of the game."

You can watch Joe Thwaites' full talk here:

Joe Thwaites

Principal Composer & Music Producer

Sony Interactive Entertainment

Joe is Principal Composer & Music Producer at Sony Interactive Entertainment Europe.

 @joethwaites
