Written by Christian Poessnicker in September 2021
As there was a lot of interest in my article about music production with game engines, I decided to translate it into English:
OK, "free" is a bit of an overstatement, because you do need to invest some of your time. But this method is second to none when it comes to flexibility and options.
The idea is quite simple: game engines simulate acoustic waves and their reflections off walls and objects (wave tracing). The thought behind this is that sound objects can be placed in a 3D world and then reveal their position and the room's properties (hall, cave, theater, etc.).
This helps gamers orient themselves in the world, localize enemies or objects, and immerse themselves in the game world.
So why should you consider using this in your music productions?
Well, if you want to use reverbs in your productions, you are normally stuck with two options:
On the one hand, you could use algorithmic reverbs. In this case the reverb is, as the name already reveals, calculated by mathematical algorithms.
On the other hand, you could use impulse response reverbs. Here the room response of a real room is recorded as an impulse and can be convolved with another signal.
The advantages of algorithmic reverbs are their flexibility and their sound-shaping options. From never-ending reverbs to short slap-back reverbs, literally everything is possible! But in most cases these reverbs cannot hide their artificial character, which isn't necessarily a problem, because music production is rarely about realism.
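To make "calculated by mathematical algorithms" a bit more concrete, here is a minimal sketch (in Python with NumPy; this code is my illustration, not part of any reverb product) of the simplest algorithmic building block, a single feedback comb filter. Real algorithmic reverbs, such as Schroeder's classic design, combine many of these with allpass filters:

```python
import numpy as np

def comb_reverb(signal, delay_samples, feedback, tail_seconds, sample_rate):
    # Extend the buffer so the reverb tail has room to decay.
    out = np.concatenate([np.asarray(signal, dtype=float),
                          np.zeros(int(tail_seconds * sample_rate))])
    # Every sample picks up an attenuated copy of the sample one
    # delay earlier -> a train of echoes, each quieter than the last.
    for i in range(delay_samples, len(out)):
        out[i] += feedback * out[i - delay_samples]
    return out
```

Feeding in a single click produces echoes at multiples of the delay time, each one reduced by the feedback factor — a crude but recognizable "room".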
If realism is the main goal, there is nothing like impulse response reverbs. The room responses are usually absolutely convincing and can be used without much tweaking. The downside is the lack of flexibility.
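Under the hood, an impulse response reverb is just a convolution of the dry signal with the recorded room response. A minimal sketch in Python with NumPy (again my own illustration, with a simple wet/dry blend):

```python
import numpy as np

def convolution_reverb(dry, impulse_response, wet=0.5):
    # The wet signal is the dry signal convolved with the room's IR;
    # its length is len(dry) + len(ir) - 1, because the tail rings out.
    wet_signal = np.convolve(dry, impulse_response)
    out = wet * wet_signal
    # Mix the dry signal back in over its own duration.
    out[:len(dry)] += (1.0 - wet) * np.asarray(dry, dtype=float)
    return out
```

The realism comes entirely from the measured impulse response — which is exactly why you can't reshape the room afterwards without recording a new one.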
Here comes the game engine:
(a hall created in Unreal Engine 4)
It is possible to design a room in-engine and assign acoustic properties to the walls.
You can choose between different materials; the room will then sound according to its shape, size, and materials.
(acoustic materials menu of the Steam Audio extension in Unreal Engine)
But it doesn't stop there, as the position of the sound source also affects the sound.
If the sound source is located in the left corner of the room, it will sound different from a sound source located in the right corner, because the reflections off the walls are different.
So you are combining the advantages of an impulse response reverb ("real" acoustic room) with those of an algorithmic reverb (maximum flexibility).
Now we have a song in our favourite DAW and want to "space" things out.
The first thing to do is to model our room in the preferred game engine — in our case, Unreal Engine 4. It is not important to make it visually appealing, as the shape of the room and the materials of the walls are much more relevant. You can get as creative as you want, because asymmetrical rooms with unusual materials can create an amazing spatial experience.
Next, you bounce all of the tracks that you want to place in the room. These wave files can easily be dropped into the room as sound objects.
(mesh view in UE4)
Once everything is placed, you need to position yourself, the listener.
Now you can record all tracks at the same time, or go through them one by one. Alternatively, you can record positional impulse responses, which gives you the most flexibility to shape your sound further.
(sends with loaded impulse responses in Nuendo)
(REVerence plugin with an IR loaded)
This approach also opens up additional use cases in Ambisonics mixing. Especially when mixing in Dolby Atmos, the list of available plugins shrinks quite a bit.
With game engines being able to output multiple channels (such as the 7.1.4 format), your reverb options can be greatly enriched.
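As a rough sketch of what that buys you (my illustration, assuming the engine renders one impulse response per speaker channel): convolving a mono stem with each channel's IR yields a reverb bed with one track per speaker — twelve tracks for 7.1.4.

```python
import numpy as np

def multichannel_ir_reverb(dry_mono, ir_channels):
    # One convolution per speaker channel: an IR array with 12 rows
    # (7.1.4) produces a 12-channel reverb bed from a mono stem.
    return np.stack([np.convolve(dry_mono, ir) for ir in ir_channels])
```

Each output channel then carries the room as heard from that speaker's position, which is exactly the kind of material an Atmos bed is hungry for.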
Here is an audio demo with a vocal stem:
For more information, you can watch this video: