Layered music with Metasound

In a previous post I discussed how to use the Sound Mixer to handle different effects as the game transitions between game modes.

However, I wanted to experiment a bit with the possibility of using layered music for Exploration and Combat.

Disclaimer: I’m not a musician. I have a very primitive understanding of music theory, enough to know I know nothing.

Imagine you have a musical composition that loops seamlessly. It has a base layer with some percussive elements and bass, enough to give it a clear root. On top of that you have two different options: a Combat layer with stronger percussion and brass, harsh sounds with a martial feeling, and an alternative Exploration layer that shares the same melody but uses soft strings and woodwinds. The Sound of Silence, by Disturbed and by Simon & Garfunkel, but sharing the same drum base.

Understanding MetaSound

To be fair, I don’t entirely understand MetaSound yet. Unreal’s MetaSound is a programmable audio system that lets you process waves (let’s say audio files, although you can synthesize waves too) through a series of operators, like a pedal chain maybe? and route them to an audio output.

Unreal sees a MetaSoundSource as if it were a simple audio file, but inside there can be any number of rules in a processing pipeline.

I have used a few rules already. For instance, I have one for selecting random footstep sounds from a list: each time a character’s foot hits the ground, it plays the MS_Step source, which contains an array of different audio clips; it selects one at random and plays it with a little pitch variation each time.
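MetaSound expresses this as a node graph rather than code, but the rule’s logic can be sketched in plain C++ (the type and function names here are mine, not engine API):

```cpp
#include <cstdlib>
#include <string>
#include <vector>

// Toy model of what MS_Step does: pick a random clip from the array
// and apply a small random pitch variation around 1.0.
struct FootstepChoice {
    std::string clip;
    float pitch; // 1.0 = original pitch
};

FootstepChoice PickFootstep(const std::vector<std::string>& clips,
                            float maxPitchOffset = 0.1f) {
    // Random index into the clip list
    int index = std::rand() % static_cast<int>(clips.size());
    // Random pitch in [1 - offset, 1 + offset]
    float t = static_cast<float>(std::rand()) / RAND_MAX;
    float pitch = 1.0f + (2.0f * t - 1.0f) * maxPitchOffset;
    return {clips[index], pitch};
}
```

In the actual graph, the same idea is a Random Get node feeding a Wave Player whose pitch input gets a small random offset.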

MetaSound and Layered Music

For this example I got my friend Jaifos to give me a track split in two layers. We’ll call these the Base layer and the Top layer.

Each music file is imported into UE5 and set to loop. Then I made a MetaSoundSource with two Wave Assets (audio files), one assigned to each of the music files. I also added an input of type float (a number between 0 and 1), which will control the volume of the Top Track.

Inputs are data values that can be modified through script. In this case I need a float input to modify the volume of the Top Track.

Variables are data values for internal use only, used to modify the behaviour of the pipeline. In this case, the two Wave Assets to which the music files are assigned.

Now I need to create the pipeline. I need both tracks to play at the same time, stay in sync, and mix their output, but I also need to control the volume of the Top Track, so that I can turn it on and off at will.

The OnPlay event ensures both tracks start at the same time and stay synchronized. The output of the Top track is multiplied by the TopVolume input (0 to silence that layer, 1 to mix it in at full volume). The result is then added to the Base track to produce the final mix, which is routed to the final output, in this case a Mono channel.

Controlling the volume by code

Once the pipeline is ready, I have to manipulate the TopVolume input when a specific event happens. In this example the audio starts with only the Base Layer on; once I enter a designated area, the Top Layer fades in, and if I exit, it fades out again.

To achieve this I made an Actor blueprint with an Audio Component. This component has my MetaSoundSource assigned and is set to “AutoActivate”.

In the logic of this Blueprint I created two custom events, one to fade in and one to fade out.

From here I can just call each event whenever I want. In this example, FadeIn fires once the character enters the WhiteBox in the map, and FadeOut once the character exits.
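The fade events boil down to ramping the TopVolume input over time. A toy sketch of that ramp in plain C++ (the function is hypothetical, not engine API):

```cpp
#include <algorithm>

// Hypothetical fade helper: given elapsed time and fade duration,
// return the TopVolume value to push into the MetaSound input.
// fadingIn = true ramps 0 -> 1, false ramps 1 -> 0.
float FadeVolume(float elapsed, float duration, bool fadingIn) {
    float t = std::clamp(elapsed / duration, 0.0f, 1.0f);
    return fadingIn ? t : 1.0f - t;
}
```

In the Blueprint, a value like this gets fed into the MetaSound input each frame, e.g. via a Timeline driving Set Float Parameter on the Audio Component.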

The Result

I still haven’t figured out what’s wrong with my OBS. The audio sounds terrible, I hope you can notice the difference. Sorry Jaifos for butchering your music with my crappy video capture.

The Base Layer has Bass and Hi-hats, the Top Layer has keyboards. Notice how the keyboards only sound when the character enters the white area.

I still don’t know if we’ll do this for the game, but it’s nice to know we have this tool and that it is so easy to use.

Thanks Jaifos!

Click the image to find my friend’s SoundCloud page.

The mission system

Bradamante is a story-heavy game, and to manage progression through it we are using a mission system inspired by Legend of Mana. This means a few things:

  • Only one mission at a time: No active side-missions, jobs, chores, etc… we want you to focus on one story beat at a time.
  • Missions can cross multiple maps: Our map is not “open world”, but we want you to be able to travel far and wide and many missions will require you to move from one place to another.
  • Keep your friends close: One feature I really want to implement is the possibility to pull other knights into your party for as many missions as possible. There will be a hub in the game where you’ll be able to talk to NPCs and recruit them to tag along. That means the game needs to be able to remember who’s in your party at all times.
  • Missions are geography-based: Most missions will activate and progress when the player reaches certain points on the map. Reaching a place will be a trigger for most missions.

Building Blocks

To explain our mission system, first I have to introduce a few concepts of Unreal’s base classes.

Player Controller:

Whenever a player is interacting with the game, there is a Player Controller managing things under the hood. Customizing the Player Controller class is the first step to owning your game structure. Our PlayerController is responsible for handling the party, the playmodes, and the input. When a knight joins the party, it does so through the Player Controller.

At all times any script running can request a reference to the Player Controller and ask it anything.

But bear in mind: each time the game changes to a new Level, the Player Controller unloads and reloads entirely. Any information it had from the previous level is lost and needs to be rebuilt when the new level loads.

No, we are not using level streaming. Yes, I know this issue wouldn't happen if we were using level streaming.

A neat feature of the Player Controller is that you can actually create custom versions of it and override the default one for specific levels. So you can have weird levels with entirely different behaviours.

Default Pawn:

An Actor in Unreal is an object that exists in a level, has a physical presence and can manage events. A Pawn is a type of actor that can be controlled by a player or an AI.

Unreal allows us to define a “Default Pawn” that will spawn in the level automatically when the game starts. Usually this is your player character, but not here: our Default Pawn is the camera manager. As a player you don’t really control one single character, but an entire party, and we want to be able to switch the party at any moment, change the main character at will, remove all members from the party during a scripted sequence, things like that.

This class also unloads completely when changing levels, which in our case is not really an issue because our camera manager won’t hold any information that needs to be remembered across levels. And this can also be overridden on specific levels, which allows for interesting “game changing” behaviours.

Game Instance

One instance to rule them all. This one is created when the game loads and remains active until the game closes completely. This is the class that can handle memory across levels.

In here I can place whatever logic and data I need when moving from one scene to the other.

Game Save

This is the class responsible for representing and storing data. At the time of writing this devlog I haven’t yet implemented this, but I already know what will go here.

Mission Handlers

I have a custom class for each mission that handles the events for that particular story beat. In these handlers I can program scripted events to my heart’s content, to guide the pacing and story progression of each mission. When a new mission starts, the Game Instance verifies that the mission is still available (hasn’t been completed yet) and, if it is, the appropriate handler is created. If the player moves to a different level, the Player Controller asks the Game Instance to rebuild the party and the current mission, if any.
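The flow described above can be modeled roughly like this (a toy sketch in plain C++; the class and member names are mine, not our actual Blueprint classes):

```cpp
#include <functional>
#include <map>
#include <memory>
#include <set>
#include <string>

// Toy mission handler: the real ones carry scripted events.
struct MissionHandler {
    std::string id;
};

// Toy Game Instance: tracks completed missions and knows how to
// build a handler for each mission id. StartMission only builds a
// handler for missions that are still available.
struct GameInstanceModel {
    std::set<std::string> completed;
    std::map<std::string,
             std::function<std::unique_ptr<MissionHandler>()>> factories;

    std::unique_ptr<MissionHandler> StartMission(const std::string& id) {
        if (completed.count(id)) return nullptr;    // already done
        auto it = factories.find(id);
        if (it == factories.end()) return nullptr;  // unknown mission
        return it->second();                        // build the handler
    }
};
```

The same check runs when a level loads and the current mission has to be rebuilt.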

The process handling a mission is illustrated by this handy diagram:

In this quick video, a new mission is created when the player hits the New Game button. A new mission handler is created for the first mission of the game, and its first scripted event is loading the Abandoned Keep level, where the mission starts. Notice how the characters move on their own for a brief period of time, some dialog plays, and then the player is given control.

This way of handling things allows us to create missions with very custom events, we can switch characters, start fights, interrupt things midway, trigger dialogues, trigger events by time or location, move characters around the map, ask the player to go to another location, etc… Almost whatever we want, we can do.

Dash, too simple to be true

One of the main inspirations for our game is HADES. Not just the camera perspective, but some of the features in movement and combat are inspired by Supergiant’s Opera Magna.

One of these features we are copying, I mean, inspired by, is Zagreus’ Dash.

It’s super snappy, responsive, and very expressive.

In Unreal Engine getting this dash to work is incredibly simple, or so it seemed at first.

The Sweep option makes sure the character won’t go through a wall or an obstacle. It checks against any object in the way that can be collided with, which is nice, but I want to evade other characters, enemies, and some objects, so before the dash we have to change the way this character interacts with other object types.
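The swap can be modeled as a tiny state change on the capsule’s collision responses. This is a toy sketch in plain C++, not the actual Unreal API (there it would be calls like SetCollisionResponseToChannel on the capsule component):

```cpp
#include <map>

// Toy model of the collision swap around the dash: before dashing we
// set the responses for characters and movable objects to Ignore,
// and restore them to Block when the move ends. Walls stay solid.
enum class Response { Block, Ignore };
enum class Channel { Pawn, WorldDynamic, WorldStatic };

struct CollisionProfile {
    std::map<Channel, Response> responses{
        {Channel::Pawn, Response::Block},
        {Channel::WorldDynamic, Response::Block},
        {Channel::WorldStatic, Response::Block}};

    void BeginDash() {
        // Pass through characters and movable objects during the dash.
        responses[Channel::Pawn] = Response::Ignore;
        responses[Channel::WorldDynamic] = Response::Ignore;
    }
    void EndDash() {
        responses[Channel::Pawn] = Response::Block;
        responses[Channel::WorldDynamic] = Response::Block;
    }
};
```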

Once the movement is over I just have to turn these back to block and it’s done… right?

Well………..

This strategy has one huge downside: any irregularity in the ground gets registered as a collision, even though the capsule would slide right over it when walking. Ok, no biggie, I just have to make sure every floor is perfectly flat and smooth… right? Well…. this also means that if I want any vertical movement in the level design, like stairs or a ramp or whatever, the dash won’t work properly on those.

So this solution is too simple to be true, and it turns out I have to come up with something different.

The new dash, now with vertical movement

I had to refactor every single aspect of this dash, from the input to the execution, everything.

Calculating the new position

For the new dash to work on rough surfaces, elevations and steps, I need to calculate a possible point to land on. This is achieved by projecting rays to the ground from a very tall elevation, at different distances in the direction of the dash.

The line trace polls only for WorldStatic objects, ignoring characters and movable objects (I might have to refactor this if we ever want to include moving platforms. Foreshadowing?). It starts in front of the character at the max distance of the dash, 10 meters high, and goes straight down to 10 meters deep. If an object is found, I check whether it has a ground tag (I have several ground tags, used to change the sound of the steps). If it does, we have some floor to land on; if the check fails, it tries again at a shorter distance, up to 5 times.
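The retry loop can be sketched in plain C++. The groundAt callback stands in for the downward line trace plus the ground-tag check, and the shrinking-distance rule is my own guess at a reasonable step:

```cpp
#include <functional>
#include <optional>

// Toy model of the landing search: try the max dash distance first,
// then shorter distances, up to maxAttempts tries. Returns the first
// distance where ground was found, or nothing (cancel the dash).
std::optional<float> FindLandingDistance(
    float maxDistance,
    const std::function<bool(float)>& groundAt,
    int maxAttempts = 5) {
    for (int i = 0; i < maxAttempts; ++i) {
        // Shrink the distance on each failed attempt.
        float d = maxDistance * (1.0f - static_cast<float>(i) / maxAttempts);
        if (groundAt(d)) return d;
    }
    return std::nullopt; // no valid ground found
}
```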

Let’s say the ground check is successful and we have ground to land on. Then we have to make sure we won’t go through any walls. This is trickier than it sounds: since the new dash needs to be compatible with elevations and steps, I can’t just trace a line from the character to the target point, because it might hit the corner of a step and fail. So I trace two lines, one at the elevation of the character and one at the elevation of the target point. I want these traces to fail, I mean, to not hit anything.

In summary: 5 traces from top to bottom looking for ground, and two more checking that there’s no obstacle in the way. Here’s how it looks with debug turned on:

The input

This is more of a quality-of-life thing… I can calculate the new dash position the moment the player decides to dash, but…. what if I could give the player a way to visualize where they’ll land? So I changed the input from on pressed (the exact frame the player hits the button) to on release (the frame the player lets go of the button). While the player is holding dash, the game is calculating the new position, and I made a little thingy that helps visualize the landing position of the dash.

Bradamante’s Plume

In the poem Bradamante is described as wearing white armor and a big red plume. Considering the aggressive top-down angle of the game, the red plume is one feature I really want to achieve. It’ll provide a lot of secondary movement and visual contrast against the background.

To achieve the plume I followed this tutorial:

I used Bradamante’s helmet as the base for the hair strands: I made a vertex group with the top rim of the helmet and groomed the strands from there.

Once the strands had the shape I wanted, I exported the file as Alembic with a scale of 1. I’m omitting a lot of details, but I basically followed the video step by step and changed only what I needed for my particular project.

In Unreal, I enabled the Groom and Alembic Groom Importer, just as the tutorial says. Once the engine has the plugins loaded, I imported the asset and adjusted the hair width, root scale and tip scale to get wide strands with thick roots and narrow tips.

I made a new Material with the Hair shading model, two-sided, with the Used with Hair Strands option checked. Specular 0, Roughness 1, Base Color red (5, 0, 0), and some red emissive.

Final touches and it looks great!

I imported Bradamante’s Helmet and adjusted it to the character’s head bone, then added the plume as a child of the helmet.

And it worked like a charm!

Audio Mixing with UE5

Unreal Engine 5 supports many different ways to manage audio for a game.

In this short blog post I’ll explain the strategy I used for Bradamante in order to get global audio mixing that changes according to Play Mode.

Context: Play Modes?

Bradamante is an action/adventure game and features several Play Modes. The Player Controller is responsible for changing between modes, which affects which inputs are read and how several systems in the game behave.

  • Exploration Play Mode would be the basic mode the player is in, most of the time. In here the player can move around and interact with objects and NPCs.
  • Combat Play Mode is active when the player enters a combat area or when the events of a level trigger a fight. Players cannot attack outside of combat.
  • In Dialogue Play Mode the player cannot move, any attack is halted, and the dialogue UI appears on screen.

I want to change the way the game sounds depending on the Play Mode. This includes a few basic features like:

  • Different music during combat
  • Focus on voice during dialogue (we hope to have VA for this game, only time will tell)
  • Muffled sounds and music while using a menu
  • Etc…

The strategy: Sound Classes

Unreal Engine is extremely deep, and there are many ways to achieve what I wanted: a system that works on any level, with any music and sound effects, with the least amount of code involved.

My solution was to use Sound Classes and Sound Class Mixers. I assign a Sound Class to each and every sound I import into the project.

At the moment of writing this, I only have 5 Sound Classes; every sound in the game is set to one of these.

  • Ambient Music is for music that plays in the Exploration Play Mode.
  • Ambient SFX is for ambient sound effects.
  • Battle Music is for music that plays during combat.
  • Character SFX is for voices.
  • Music is a parent class for both types of music.

The next step is to configure a Sound Class Mixer for each of my Play Modes. Every Sound Class Mixer has particular rules for each sound class.

For instance, the Combat Mixer has 0 volume for Ambient Music and 1 volume for Battle Music.
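A Sound Class Mixer is essentially a table of per-class volume multipliers. A toy model in plain C++ (the function and type names are mine; the class names mirror the list above):

```cpp
#include <map>
#include <string>

// Toy model of a Sound Class Mixer: per-class volume multipliers
// applied when the mixer for the current Play Mode is active.
using SoundMix = std::map<std::string, float>;

float ClassVolume(const SoundMix& mix, const std::string& soundClass) {
    auto it = mix.find(soundClass);
    return it != mix.end() ? it->second : 1.0f; // untouched classes stay at 1
}
```

In Unreal, activating the table for the current Play Mode would be done with something like Push Sound Mix Modifier.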

Now, when I switch Play Modes in the Player Controller, I also set the corresponding Sound Mixer. Some are missing at the moment; I’ll add them later.

This affects all sounds playing at any given moment, as long as their Sound Class is properly assigned. This strategy requires some care when adding and classifying sounds, but it saves me a lot of time later when I’m working on a level.

Speaking of which, for the music I only have to add two Ambient Sound actors in my level, each assigned to an audio clip that has the correct Sound Class.

The Result:

Callisto


The little exoplanet Callisto is named after a Nymph, the daughter of the infamous king Lycaon and the Naiad Nonacris.

The myth says that Callisto joined the hunting party of the goddess Artemis, and thus she was constantly seen close to the goddess.

Zeus seduces Callisto while disguised as Artemis. Extracted from the Zeus music video by Pascu y Rodri.

Zeus became obsessed with the charms of the young huntress and followed her during their hunting trips. Once she was alone, Zeus disguised himself as Artemis and seduced Callisto, impregnating her.

Artemis, who demanded that all her followers remain pure, found out about the affair and expelled Callisto from her company. The poor Nymph roamed the forest alone as the pregnancy advanced.

Callisto's demise. Extracted from the Hera music video by Pascu y Rodri.

Hera found out about her husband’s infidelity and transformed Callisto into a bear, then tricked Artemis into shooting her during a hunt. Zeus felt pity for the Nymph. He sent Hermes to rescue the unborn baby, Arcas, and take him to the Nymph Maia. Then he transformed Callisto into a constellation, Ursa Major, the Great Bear, where she could live safe from Hera’s wrath.

Her son, Arcas, succeeded his grandfather Lycaon, and the kingdom received the name Arcadia in his honor. On Arcas’ deathbed, Zeus transformed him into a constellation, Ursa Minor, the Little Bear, and placed him next to his mother in the sky.

In Mason’s Diary, Ferdinand refers to the planet as a “Sterile hunk of rock”, which probably refers to it being a solid-core exoplanet, devoid of any signs of life. Mason describes it as mostly iron and mineral carbon, with traces of copper and other minerals, in a solid, compact, almost spherical formation. Some traces of methane rivers and lakes can be identified on its surface, but most of the methane has evaporated or is missing, probably extracted for future use. By the time the Ziggurat reaches this planet the mining operations are finished, but the planet is still exploitable. Mason has the task of cleaning it up in case the Mining crew decides to return.

Sources:
https://www.greeklegendsandmyths.com/callisto.html
https://www.greekmythology.com/Myths/The_Myths/Zeus%27s_Lovers/Callisto/callisto.html

Check out the incredibly good music videos about Zeus and Hera from Spanish YouTubers Pascu y Rodri: