During our recent Interactive Music Symposium, Joe Thwaites dove into how his team used interactive music design systems in Sackboy: A Big Adventure to help elevate the game’s score. Below, you’ll find a summary of key points from his talk.
Expanding the Core DNA
"The goal of the music is always to support the story, and interactive techniques are really great tools for helping us achieve this goal," says Joe. When creating Sackboy: A Big Adventure, the audio team decided to build on the musical legacy of the LittleBigPlanet titles, specifically the combination of licensed tracks and original pieces. Only this time, they wanted both licensed and original pieces to feel "equally as embedded and interactive." To them, this meant thoughtfully making use of interactive techniques which supported the narrative of the game, such as horizontal resequencing, vertical layering, stingers & embellishments, and runtime processing effects.
Let’s look at how Joe’s team used each of these systems in Sackboy: A Big Adventure.
Horizontal Resequencing
Horizontal resequencing is an interactive music technique where music is dynamically pieced together based on a player’s actions. For Joe's team, this meant splitting each track into loopable chunks so the music could move on to a new section, or repeat the current one, depending on what the player did (see the sketch after this list). They used this technique whenever they wanted to:
• Help create momentum
• Change the mood
• Add a cadence to the music
• Change seamlessly to a new piece of music
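To make the idea concrete, here is a minimal, hypothetical sketch of a horizontal resequencer. The segment names, the PlayerState fields, and the play_segment() stub are all invented for illustration and are not Sackboy's actual implementation; the key point is that changes only happen at loop points, so the resequencing stays seamless.

```python
from dataclasses import dataclass

@dataclass
class PlayerState:
    checkpoint: int       # how far through the level the player has progressed
    at_level_end: bool    # True once the player reaches the goal

# Each piece is split into loopable chunks, keyed here by checkpoint.
SEGMENTS = {0: "intro_loop", 1: "verse_loop", 2: "chorus_loop"}
OUTRO = "cadence_outro"

def play_segment(name: str) -> None:
    print(f"queueing segment: {name}")  # stand-in for the real audio engine call

def next_segment(current: str, state: PlayerState) -> str:
    """Decide what to play when the current chunk hits its loop point."""
    if state.at_level_end:
        return OUTRO                          # add a cadence to close the piece
    # Advance to the chunk for the player's checkpoint, or keep looping.
    return SEGMENTS.get(state.checkpoint, current)

def on_loop_point(current: str, state: PlayerState) -> None:
    play_segment(next_segment(current, state))

# e.g. the player has just passed checkpoint 1 while "intro_loop" is playing:
on_loop_point("intro_loop", PlayerState(checkpoint=1, at_level_end=False))
```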
Let's dive into a few examples:
Vertical Layering
Vertical layering is an interactive music technique in which layers of music can be added or removed to:
• Add variation
• Change the mood
• Build or release tension
• Support narrative development
Joe’s team approached each track on a case-by-case basis, but they generally split the music into the following layers: drums, bass, lead, accompaniment, and vocals. Depending on gameplay events, they would add or remove layers to elevate moments that supported the narrative. Let's have a look at some examples of this:
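As a rough illustration of the idea (not the team's actual setup), a vertical layering controller could map a single gameplay intensity value onto per-layer volumes. The layer names come from the talk; the thresholds and the set_layer_volume() stub are invented for this sketch.

```python
LAYERS = ["drums", "bass", "accompaniment", "lead", "vocals"]

def set_layer_volume(layer: str, volume: float) -> None:
    print(f"{layer}: {volume:.2f}")  # stand-in for a real mixer/bus call

def update_layers(intensity: float) -> None:
    """Fade layers in as gameplay intensity (0..1) rises, and out as it falls."""
    thresholds = {"drums": 0.0, "bass": 0.2, "accompaniment": 0.4,
                  "lead": 0.6, "vocals": 0.8}
    for layer in LAYERS:
        # Each layer fades in linearly over a 0.2 window above its threshold.
        level = min(max((intensity - thresholds[layer]) / 0.2, 0.0), 1.0)
        set_layer_volume(layer, level)

# e.g. tension building toward a boss fight:
update_layers(0.7)  # drums, bass, accompaniment fully in; lead fading in; vocals silent
```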
Stingers & Embellishments
Stingers & embellishments are short musical elements triggered by gameplay events. "These are great for highlighting specific moments of gameplay instantly, and giving feedback to the player when they do certain actions or collect certain things," says Joe. His team used these in Sackboy to:
• Reward the player
• Support gameplay mechanics
• Smooth transitions
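Here is a hedged sketch of how event-triggered stingers might be wired up. The event names, stinger names, and quantization grid are illustrative rather than the game's real data; quantizing the trigger to the next bar simply keeps the stinger in time with the underlying music.

```python
STINGERS = {
    "collectible_picked_up": "sparkle_stinger",   # reward the player
    "checkpoint_reached": "fanfare_stinger",      # support gameplay mechanics
    "area_transition": "riser_stinger",           # smooth transitions
}

def beats_until_next_bar(song_position_beats: float, beats_per_bar: int = 4) -> float:
    """Delay the stinger so it lands musically on the next bar line."""
    return beats_per_bar - (song_position_beats % beats_per_bar)

def on_gameplay_event(event: str, song_position_beats: float) -> None:
    stinger = STINGERS.get(event)
    if stinger is None:
        return
    delay = beats_until_next_bar(song_position_beats)
    print(f"schedule {stinger} in {delay:.2f} beats")  # stand-in for the engine call

on_gameplay_event("collectible_picked_up", song_position_beats=13.5)  # fires 2.5 beats later
```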
Here are a few examples of this:
Runtime Processing
As well as editing the music itself, Joe’s team applied filters and effects at runtime that manipulated the music depending on gameplay (a small sketch follows the list below). They did this to:
• Embed the music into the game world
• React to gameplay states
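A minimal sketch of the idea maps gameplay states to effect parameters on the music bus. The state names, cutoff values, and apply_music_dsp() stub are hypothetical, chosen only to show the shape of the approach rather than the game's actual DSP chain.

```python
# Per-state effect settings for the music bus (values are illustrative).
GAME_STATE_FX = {
    "underwater":  {"lowpass_hz": 800,   "reverb_wet": 0.5},  # embed the music in the world
    "menu_paused": {"lowpass_hz": 1200,  "reverb_wet": 0.2},  # react to gameplay states
    "normal":      {"lowpass_hz": 20000, "reverb_wet": 0.0},  # effectively bypass
}

def apply_music_dsp(state: str) -> None:
    fx = GAME_STATE_FX.get(state, GAME_STATE_FX["normal"])
    # Stand-ins for real filter/reverb parameter calls on the music bus.
    print(f"low-pass cutoff -> {fx['lowpass_hz']} Hz, reverb wet -> {fx['reverb_wet']}")

apply_music_dsp("underwater")
```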
In the following video, you'll hear some of these runtime processing effects:
Conclusion: Scoring Gameplay to Support the Narrative
Before exploring interactive music techniques, it's vital to consider how those techniques are going to support the narrative of the game. For Sackboy, Joe's team wanted the music to support momentum through the level, so they used horizontal resequencing to dynamically change the music as the player progressed. They wanted the music to build up into narrative moments and boss fights, so they used vertical layering to help build tension and drama. They wanted the music to reward the player when they made progress towards the goal, so they used stingers and musical embellishments to give feedback. And because they wanted the music to feel embedded in the game world, they used runtime processing effects to react to gameplay states. "In a game like this," says Joe, "there's so much that has potential to feed the music system, it is important for us to be guided by the journey of the player within the context of the game."
You can watch Joe Thwaites' full talk here: