Written by Ashton Mills

In games, characters have multi-sensory experiences: they use all of their senses to experience their world. Players, on the other hand, have only two modalities with which to partake in those characters’ experiences: vision and hearing*.
As audio designers we bring games to life with sound not just by making things sound the way they would in real life, but by ‘storytelling’ – attempting to convey what the characters feel. ‘Feel’ here is most often interpreted as referring to their emotional state, but I believe that sound designers, whether or not they are conscious of it, are also attempting to convey feeling in a more literal sense: the experience of physical sensations.
In real life, if I picked up a sword and swung it, not hitting anything, it likely wouldn’t make any noise. If my character swung a sword in a game and it didn’t go ‘whoosh’ I’d probably think there was some kind of problem, something missing.
This expectation to hear something that in real life would be inaudible likely comes from my history of similar experiences of characters swinging weapons in games and films. But how did that become a precedent in the first place? I believe there are several reasons why cinematic precedents exist for fabricating or exaggerating the sound of things that are realistically inaudible or very quiet, and one of them is that games and films must portray the experiences of all the senses using only two of them: we need to be able to hear things that in real life we would experience with other senses.
One of my biggest inspirations for thinking about this was watching the Unreal Engine 5 demo, in which the character is exploring an ancient tomb.
There is a section where she is climbing through a narrow passage in the cave and we hear, in beautiful detail, the scraping, debris and footsteps of her squeezing through, the slapping of her hands on the cave wall, and the effort in her breath and voice. What made this stand out to me was that the demo itself is all about realism: it’s there to show off the incredibly powerful systems that make the virtual world life-like: the lighting, the image resolution, the spatial audio and the convolution reverb. In contrast to that realism we have exaggerated, cinematic sound design: all of these sound elements are considerably louder and more textured than they would be in real life. Yet this style of sound design paradoxically makes the scene feel more real, because through sound we ‘feel’ the character squeezing through: the movement of her body against the rocks, the cool stony texture of her hand on the cave wall, and her care in keeping her balance.
Thermoception
Thermoception is our sense of temperature, of hot and cold.
When our character walks out of a warm building into the cool air outside, they would experience a change in temperature. The sound designer could communicate this by creating a contrast in the ambient buzztrack: something stuffier and deeper inside versus an open, airy outside ambience with more high-frequency content. The sound of wind, maybe even the sound of wind blowing against the character and flapping their clothes, would enable the player to ‘feel’ the contrast between the warm and the cold.
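As a rough illustration, that indoor/outdoor contrast could be driven by a single transition parameter that crossfades two ambience beds as the character moves through the doorway. This is a minimal engine-agnostic sketch, not how any particular middleware does it; the equal-power curve is a common choice for keeping the perceived loudness steady across the transition:

```python
import math

def ambience_crossfade(t):
    """Equal-power crossfade between an 'inside' and 'outside' ambience bed.

    t runs from 0.0 (fully inside, stuffy and dull) to 1.0 (fully
    outside, open and airy). Returns (inside_gain, outside_gain).
    The two-bed setup is illustrative; any pair of beds with
    contrasting frequency content would work.
    """
    t = max(0.0, min(1.0, t))
    # cos/sin curves keep inside_gain**2 + outside_gain**2 == 1,
    # so the summed power stays constant through the crossfade.
    inside_gain = math.cos(t * math.pi / 2)
    outside_gain = math.sin(t * math.pi / 2)
    return inside_gain, outside_gain
```

In practice the `t` parameter would be fed by gameplay (distance through the doorway, or a trigger volume), and each bed would also carry the EQ contrast described above.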
Nociception
Nociception is our sense of pain. Imagine our character is fighting a troll wielding a big boulder. The troll smashes the boulder into our chest and we take a lot of damage.
In addition to the touch sensation of the boulder against their body, the character would feel a lot of pain. If this is a first-person game and we can’t even see our player character, then the need to express the perception of pain through sound is especially important. When the boulder impacts, alongside the crunchy smashing of the rock and a hint of breaking bone, we might hear an explosive thump (perhaps literally using a sample of an explosion), a low-pass filter could be applied to all in-game sound for a few seconds, and we could hear voice emotes of pain and heavy breathing from the character.
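The ‘muffle everything for a few seconds’ effect is usually a low-pass filter on the master bus. As a sketch of what that filter actually does to the signal, here is a one-pole low-pass in plain Python; a real game would use its engine or middleware’s bus filter rather than raw DSP like this, and the cutoff and sample rate here are illustrative values:

```python
import math

def one_pole_lowpass(samples, cutoff_hz, sample_rate=48000):
    """Apply a simple one-pole low-pass filter to a list of samples.

    High-frequency content (detail, brightness) is attenuated while
    low-frequency content passes through, which is what produces the
    dulled, 'underwater' quality after a heavy hit.
    """
    # Smoothing coefficient derived from the cutoff frequency.
    dt = 1.0 / sample_rate
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    alpha = dt / (rc + dt)

    out = []
    prev = 0.0
    for x in samples:
        # Each output sample moves a fraction alpha toward the input,
        # so fast changes are smoothed away and slow ones survive.
        prev = prev + alpha * (x - prev)
        out.append(prev)
    return out
```

In a game the cutoff would be ramped down on impact and eased back up over a few seconds, rather than switched instantly.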
Equilibrioception
Equilibrioception is our sense of balance. Let’s say our game is set on a ship that is caught in a storm and is rocking aggressively from side to side, seriously messing with our character’s sense of balance. When the ship lurches, we may hear great creaks of the wood with thumpy impacts on the peak of the lurch, alongside low whooshing that syncs with the roll rotation of the camera. Breaking up the established rhythm of the footsteps and foley would help to portray the character’s struggle to regain balance.
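Syncing the whoosh with the camera roll can be as simple as driving the whoosh layer’s gain from the roll speed. This is a hypothetical mapping for illustration: the units (degrees per second) and the maximum rate are assumed values, and a shipped game would likely smooth and curve-shape this signal:

```python
def lurch_whoosh_gain(roll_rate_deg_per_s, max_rate=90.0):
    """Map the camera's roll speed to the gain of a low whoosh layer.

    The whoosh is loudest at the fastest point of the lurch and falls
    silent when the ship is momentarily level. max_rate, the roll
    speed treated as 'full loudness', is an illustrative value.
    """
    return min(abs(roll_rate_deg_per_s) / max_rate, 1.0)
```

Called once per frame with the camera’s current roll rate, this ties the sound directly to the motion that is unsettling the character’s balance.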
Proprioception
Proprioception, also known as kinesthesia, is the sense that enables us to perceive the location, movement and actions of our bodies. We can walk without looking at the ground, scratch our heads with eyes closed, and judge the amount of force required to lift or push an object thanks to our proprioception. I believe this to be the most crucial sense to simulate through auditory information because it is the primary modality that our characters use to interact with their environments.
When our character swings a sword, they will feel the weight of it and use kinesthesia to apply force in the right way to drive it towards a target. Sound designers will make the sword go ‘whoosh’ as it glides through the air. The shape and frequency content of the whoosh reflect the size and weight of the sword, and a thumping layer on the sword’s impact with an enemy portrays the force of resistance against it.
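One way to sketch the link between a weapon’s heft and the whoosh’s frequency content is a playback-rate curve: heavier swords play the same whoosh sample slower, and therefore lower and fuller. The square-root mapping and the reference weight below are assumptions for illustration, not an established rule:

```python
import math

def whoosh_playback_rate(weight_kg, ref_weight_kg=1.5):
    """Heavier swords get a lower, slower whoosh.

    Returns a playback-rate multiplier for the whoosh sample: 1.0 at
    the reference weight, below 1.0 (lower pitch, longer sweep) for
    heavier weapons, above 1.0 for lighter ones.
    """
    return math.sqrt(ref_weight_kg / weight_kg)
```

In practice a designer would more likely author distinct whoosh assets per weapon class, with a curve like this only adding per-swing variation on top.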
Another example could be the movement of a character climbing up a ledge. Even very strong and athletic characters make a racket with their voice emotes when climbing, and cloth foley and debris sounds will often be really hammed up, both of which help to communicate the movement of the body through space.
Touch
Our sense of touch gives us a great deal of information about the material properties of the things we interact with and it grounds us in the world. Exaggerated foley really helps to portray touch in games. When we can hear the texture of objects and materials we can much more easily imagine what it is like to touch them. The above example of the Unreal Engine 5 demo is a good example of this.
Another example is the way that we often hear the high-pitched ringing of metal in games when there is a sword or knife, even when it hasn’t struck or been struck by anything. The ringing of the metal helps the player imagine what the blade would be like to touch.
Taste and Smell
These two present much more of a challenge, as it’s much less obvious how to cross from the gustatory and olfactory domains into the auditory. One of the most common occurrences of taste in games is the consumption of potions to replenish health or mana, or to apply some kind of state. One way we can portray the character’s sense of taste when consuming a potion is through the details in the bodily sounds. A fast and vigorous gulp followed by a satisfied, airy, open-mouthed outward breath (or perhaps even a belch) could communicate a positive taste experience, whereas a more laboured and drawn-out swallowing, with visceral throaty layers, followed by a sharp, round-mouthed outward breath, maybe even some low throaty vocalisation or a cough, would communicate an unpleasant taste.
I think that if the game’s script makes some reference to smell, and this is accompanied by convincing ambient audio, it can be possible to convey a scent. For example, if a character is in a cave and comments on the smell of damp, and we get a lot of stuffy cave ambience with the sound of water dripping, then through the combination of these we might well be able to imagine the smell.
In summary, there are a multitude of ways that sound designers can use audio to fill in the sensory blanks inherent to a medium that delivers information across only two senses. For me personally, using this lens for thinking about sound design really stimulates creativity. It offers an alternative to the emotional lens in sonic storytelling, and it answers some of the questions around why we do what we do and why certain creative decisions feel natural or unnatural.
* I have not included haptic rumble as a player modality. While this of course adds a great deal to the game experience, it doesn’t really count as ‘touch’ because it only offers one of the billions of potential touch experiences a person can have. It is the tactile equivalent of bleeps and bloops.
A big thanks to Ashton Mills for the insights on how game audio can help to convey sensory information for a much richer gaming experience, and to Ali Tocher for editing!
Ashton Mills is an Audio Designer at Jagex, heading up the combat sound design on its upcoming next-gen Action RPG.