Here, Aadahl goes into detail on how they approached the sounds for weapons, vehicles, and devices, what went into the sound of the humanoid robots called Simulants and also the less-advanced robots, how they designed the futuristic sound of cities in New Asia, and much more!
Interview by Jennifer Walden, photos courtesy of 20th Century Studios
In The Creator, Director/Writer Gareth Edwards depicts a near-future in which humanity goes to war over the continued use of artificial intelligence. Humanoid robots, known as Simulants, live alongside the human population. They aren’t typical ‘robots’ but rather sentient creations capable of independent thought and feeling. After a nuclear attack on Los Angeles that was blamed on artificial intelligence, the Western world has decided that AI would be the destruction of all humanity and should be eradicated. But the Eastern world disagrees. New Asia continues to produce Simulants and intelligent robots, seeing them as a valuable addition to its population – even employing fully autonomous robots on its police force.
Oscar nom’d supervising sound editors/sound designers Erik Aadahl and Ethan Van der Ryn at E² Sound – who worked with Edwards on Godzilla – created the sound of the near-future world for The Creator that is a “blend of natural realism, ‘vintage futurism,’ and fully futuristic high-concept technology,” says Aadahl.
It sounds familiar yet futuristic. For instance, the ambiences in the cities of New Asia are a combination of real recordings from Southeast Asian cities layered with designed sounds of new tech advancements and upgrades. They also created new city sirens, traffic crossing chimes, advertising announcements, and other elements to sell the feeling of futuristic cityscapes. They layered together loop group recordings in several different Asian languages and then combined that with binary data “bleeps and bloops” to represent the robot inhabitants.
Here, Aadahl talks about the overall aesthetic for the tech sounds in The Creator and how they used that direction from Edwards to design the wide variety of vehicles, robots, weapons, devices, and aircraft in the film. He talks about creating the sound of the Simulants and robots – how the two differ sonically from each other, and how the sound of the US robots differs from that of the New Asia robots. He also talks about designing sounds to the mock-ups of Hans Zimmer’s score, and much more!
The Creator | Official Trailer
What was director Gareth Edwards’s overall direction for the sound of the tech in The Creator? Did he share any references or examples of what he liked (or didn’t like) in terms of the sound of the future technology? Or what were some adjectives he used to describe the overall sound aesthetic he wanted for the technology?
Erik Aadahl (EA): Before production began, Gareth and I met for dinner to discuss The Creator, which at the time he called “True Love.” Gareth pulled out his laptop and showed me the concept art he’d been cooking up for months.
What struck me was this incredibly fresh aesthetic, a mix of old and new. In that first discussion, Gareth explained this vision of the future, a whole spectrum of technology from the old, ancient, and familiar, to the completely futuristic, and everything in between – visually and sonically.
I think it was only a week or two later that Gareth did an 8-country location scout, which he used to make a 10-minute proof-of-concept “mood piece.” When he sent us the cut, it felt like a futuristic Baraka, this blend of Terrence Malick and James Cameron.
For the sound, he wanted this blend of natural realism, “vintage futurism,” and fully futuristic high-concept technology. This direction translated exactly into our approach for the whole film.
Let’s dive a bit deeper into the awesome sound of this future tech:
What went into the sound of the USS NOMAD light scanner? (I loved the tonality of this scanning sound!)
EA: NOMAD was one of my favorite concepts in the film, this massive orbital station that projects these blue beams over the terrain for missile targeting.
Gareth described that the beams should sound radioactive. If someone put their hand into the beam for too long, they’d get cancer. It should sound dangerous. Gareth loves making sound effects with his mouth when he has ideas, and he performed this low pulsing hum. When you close your eyes, there should be no doubt that the sound conveys danger and volatility.
We made a whole batch of sounds, many fully synthetic. Some of my favorite NOMAD flavors were made by sound designer Malte Bieler. Once the visual effects started coming in and we saw the beams animated and re-orienting as they targeted, we could build energy bursts and evolving hums to match. The warping quality, in sync to picture, definitely magnified the danger effect. And I love how effects re-recording mixer Dean A. Zupancic used those sounds in Dolby Atmos, in height and width.
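For readers who want to experiment with this kind of sound themselves, here is a minimal sketch of how a low, pulsing “danger hum” could be roughed out in Python with NumPy and SciPy. This is not E² Sound’s actual method; the sample rate, frequencies, pulse rate, and output filename are purely illustrative assumptions.

```python
import numpy as np
from scipy.io import wavfile

SR = 48000      # sample rate in Hz (an assumption, not the actual session setting)
DUR = 8.0       # length of the sketch in seconds
t = np.arange(int(SR * DUR)) / SR

# Two slightly detuned low sines give the hum a slow, uneasy beating quality.
hum = np.sin(2 * np.pi * 55.0 * t) + 0.8 * np.sin(2 * np.pi * 55.7 * t)

# A slow ~2 Hz pulse amplitude-modulates the hum, like a heartbeat inside the beam.
pulse = 0.5 * (1.0 + np.sin(2 * np.pi * 2.0 * t - np.pi / 2))
danger_hum = hum * (0.3 + 0.7 * pulse)

# Normalize and write a 16-bit WAV for auditioning.
danger_hum /= np.max(np.abs(danger_hum))
wavfile.write("nomad_hum_sketch.wav", SR, (danger_hum * 32767).astype(np.int16))
```

The detune between the two sines creates a slow beating that reads as unstable; in practice, layers of synthesis, recordings, and picture-synced processing would do the real work.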
What went into the sounds of the vehicles? Like, the military ‘jet-helicopter’? And the massive US Army tank that rolls into the AI village? And the US military ‘skid steer with arms’ looking thing?
EA: Again, with the vehicles we wanted a blend of realistic and recognizable, and super high-tech and unrecognizable.
For the “jetcopters,” which we see and hear during the first raid in New Asia, Gareth imagined a fast pulsing sound, which he also performed with his mouth. There’s something about that helicopter-thump rhythm that is so reminiscent of war movies like Apocalypse Now. We wound up using a synthetic kick-drum-style thump, which we modulated and Dopplered, and jet turbines for the vertical take-off and landing moments.
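To illustrate the general idea of a modulated, Dopplered thump (not the film’s actual recipe), here is a hypothetical Python sketch: a synthetic kick-style hit repeated at a helicopter-like rate, then resampled with a slowly varying playback speed to fake a Doppler pass. All values and the filename are assumptions.

```python
import numpy as np
from scipy.io import wavfile

SR = 48000
t_hit = np.arange(int(SR * 0.12)) / SR

# One synthetic kick-style thump: a short sine whose pitch falls quickly, with a fast decay.
freq = 90.0 * np.exp(-t_hit * 18.0)
phase = 2.0 * np.pi * np.cumsum(freq) / SR
thump = np.sin(phase) * np.exp(-t_hit * 30.0)

# Repeat the thump at a helicopter-like rate (~13 hits per second) for 6 seconds.
rate_hz, dur = 13.0, 6.0
out = np.zeros(int(SR * dur))
for i in range(int(dur * rate_hz)):
    start = int(i * SR / rate_hz)
    seg = min(len(thump), len(out) - start)
    out[start:start + seg] += thump[:seg]

# Crude Doppler: resample with a slowly varying playback speed so pitch rises, then falls.
n = len(out)
speed = 1.0 + 0.15 * np.sin(np.linspace(0.0, np.pi, n))   # up to 15% faster mid-pass
positions = np.cumsum(speed)
positions = positions / positions[-1] * (n - 1)
doppler = np.interp(positions, np.arange(n), out)

doppler /= np.max(np.abs(doppler))
wavfile.write("jetcopter_pulse_sketch.wav", SR, (doppler * 32767).astype(np.int16))
```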
During that raid sequence, Tom Ozanich (re-recording mixer on dialogue/music) did something so out of the box with Radiohead’s “Everything in Its Right Place” in the mix. Combined with that jetcopter pulse, I got goosebumps.
The sound of the “Tank” is a funny story. The first scene that the picture department sent us was that tank attack sequence. This was before any VFX had come in, so all we saw was the word “TANK” placed over the production photography. Gareth showed us some concept art, and we realized these tanks were huge. We also noticed that there was this void in the giant treads, so we imagined there was maybe an electromagnetic technology propelling the tank tread.
The sound we wound up using was a happy accident. I was driving from LA to Mammoth for a weekend and veered into the Highway 395 median, which is serrated to warn drowsy drivers. The whole car lit up into this big “WAAAAAAAAAAAA!” I pulled over immediately and set up my microphone and recording rig to capture it. I always have my rig with me. That serendipitous sound, with tremolo, became the tank sound. Malte Bieler lined up deep metallic clunks to that rhythm, as a sweetener for the close-ups.
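As a rough illustration of that “serendipitous sound with tremolo” idea, and not the actual session processing, the sketch below applies a simple tremolo to any mono recording in Python; the filename “rumble_strip.wav” and the rate/depth values are placeholders.

```python
import numpy as np
from scipy.io import wavfile

# Load any mono recording ("rumble_strip.wav" is a placeholder name, not a real asset).
sr, x = wavfile.read("rumble_strip.wav")
x = x.astype(np.float64)
if x.ndim == 2:                      # fold stereo to mono for simplicity
    x = x.mean(axis=1)

t = np.arange(len(x)) / sr
rate_hz, depth = 8.0, 0.6            # tremolo speed and depth: arbitrary starting values
tremolo = 1.0 - depth * 0.5 * (1.0 + np.sin(2 * np.pi * rate_hz * t))

y = x * tremolo
y /= np.max(np.abs(y))
wavfile.write("tank_tremolo_sketch.wav", sr, (y * 32767).astype(np.int16))
```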
What went into the traffic sounds for the city in New Asia?
EA: For scenes in the cities of New Asia, like Lilac City, we used our own recordings from Southeast Asia as a starting point, such as city sounds from Bangkok, Phnom Penh, Ho Chi Minh City, Hanoi, etc. We then did the same thing that Gareth did with the on-location visuals and VFX — we layered new technologies over those realistic sonic beds. So, for example, new electric engines are peppered in with the old tech of 2-stroke engines.
For many of the vehicles (cars, motorcycles, buses, trains, boats, aircraft, etc.), Gareth asked that they be old technology “souped-up and modified” with more advanced tech.
We also created a new palette of city sirens, traffic crossing chimes, advertising announcements, and anything that could add an exotic feeling to the cityscapes.
What went into the subtle sounds of the Simulants?
EA: The whole point of the simulants is that they are sentient, thinking and feeling conscious beings. So how we treated their sounds had to strike a careful balance to be believable and grounded.
Every single simulant movement has a subtle servo motor sound, which we may or may not choose to feature, and varies by character. The simulants also have a void in their heads, between the ears, with wheels spinning as they think. Visually, they reminded me of the smooth brass fly-wheels used in magnetic sound dubbers from the analog era of film sound. It took some time to make the right sounds that didn’t get irritating or intrusive but would register when we wanted them to.
When Alfie processes information, her ear wheels reflect that “gears-turning” thinking process. And when she uses her powers, they accelerate into a building whir that culminates with the organic sound of a didgeridoo instrument that we recorded, as she connects to the technology around her.
What went into the physical sound of the robots? And how did you handle their vocal processing?
EA: The New Asia robots are more rudimentary than the simulants, but still sentient. Visually, Gareth wanted many of them to have a vintage “Sony Walkman” quality, so sonically we did the same.
Gareth is a fan of “vintage futurism,” the sound of 1970s analog audio processing like the vocoder, for example. He mentioned robotic vocals from THX 1138 (Walter Murch) and some of sound designer Ben Burtt’s treatments in the Star Wars universe. A lot of what we wound up using was digital vocoders, which we could tweak using MIDI. We carried these robot vocal treatments as live processing through the final mix, so we could adjust infinitely to the very end.
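For context on what a digital vocoder is doing under the hood, here is a bare-bones channel vocoder sketch in Python: the voice’s band-by-band loudness shapes a synthetic carrier. It is not the plug-in chain used on the film; the band count, filter settings, carrier, and filenames are all illustrative assumptions.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, sosfilt

# "voice.wav" is a placeholder modulator recording; all parameters are illustrative guesses.
sr, voice = wavfile.read("voice.wav")
voice = voice.astype(np.float64)
if voice.ndim == 2:
    voice = voice.mean(axis=1)
voice /= np.max(np.abs(voice))

# Carrier: a buzzy sawtooth drone, the classic "robot" source signal.
t = np.arange(len(voice)) / sr
f0 = 110.0
carrier = 2.0 * ((f0 * t) % 1.0) - 1.0

def bandpass(x, lo, hi):
    sos = butter(4, [lo, hi], btype="bandpass", fs=sr, output="sos")
    return sosfilt(sos, x)

def envelope(x, cutoff=30.0):
    sos = butter(2, cutoff, btype="lowpass", fs=sr, output="sos")
    return sosfilt(sos, np.abs(x))

# Logarithmically spaced analysis bands from 100 Hz to 8 kHz.
edges = np.geomspace(100.0, 8000.0, 17)
out = np.zeros_like(voice)
for lo, hi in zip(edges[:-1], edges[1:]):
    out += bandpass(carrier, lo, hi) * envelope(bandpass(voice, lo, hi))

out /= np.max(np.abs(out))
wavfile.write("robot_voice_sketch.wav", sr, (out * 32767).astype(np.int16))
```

In a session, this kind of processing would typically be a plug-in kept live through the mix, as described above, rather than a rendered script.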
The more complicated communication, using a mix of various languages and the computer-data “morse code” invented for the police bots, was not done with live plug-ins and had to be rendered.
In the beginning of this process, Gareth considered processing the simulants’ vocals as well, in particular for Alfie. We did vocal tests, from minimal to extreme, and ultimately decided to keep her character completely pure and natural.
How do the New Asia robot police compare sonically to the US military robots (like the walking bomb robots they use to attack the AI base in the village in New Asia?)
EA: Hah, fun question! So in this universe, AI is banned in the West but is legal and thriving in New Asia. So all of the New Asia police communication uses actual human recordings as the starting point to mimic AI.
The U.S. military robot bombs, which we call G-13 and G-14, are not AI (as that technology is banned) but a more rudimentary, pre-programmed robot technology. We decided not to use computer-generated speech to give them voice in the final version of this movie; it served only as a reference. I always get a great chuckle at G-13’s farewell statement before he runs off like a kamikaze bot.
What went into the sound of the guns? And the rockets? And the massive bombs that NOMAD drops?
EA: The guns took some time to develop. Gareth described them as munition-based but could have a fun sci-fi flavor. He imagined the bullets would use smart-LED tech to light up and streak like tracers. Gareth and picture editor Hank Corwin wanted them to sound “unlike anything” used in sci-fi before.
We made dozens of different weapon flavors, starting with our recordings of suppressed ammunition weaponry and then twisting them with synth spikes or servo blasts. They needed to sound powerful but “different,” and varied between the New Asia and West technologies. For the bullet “zips,” we processed sounds like air squeals, actual bullet fly-bys, and different sharp props that we sliced through the air. One of my favorites was a studio ID attached to a lanyard zipping by my Neumann microphone.
The tank missiles are also a mix of conventional and unconventional. Some of the tank missiles are baboon screeches, which to me have a visceral and terrifying sound quality.
The NOMAD missiles have a fighter jet base, sweetened with processed tea kettle shrieks to, hopefully, elicit a “freaky, haunting” quality.
There are also fun gadgets and smaller tech – like scanners, projectors, the moving photo that Joshua carries, and the hand-held translator (which has that Speak & Spell sound!). What were some of your favorite sounds for the small devices and tech that are used in the film, and what went into those sounds?
EA: I love those translator devices, and yes the intent was very “Speak & Spell.” Gareth milks them for comedy in a couple of great moments. Again, we used computer-generated text as a reference, and then recorded a wide variety of voice actors as options. We then manipulated those line readings using editing, pitch-tuning, and futzing to simulate a non-sentient computer voice.
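As a hypothetical example of that kind of pitch-tuning and futzing, and not the team’s actual chain, the Python sketch below narrows, speeds up, and lightly bit-crushes a recorded line to suggest a small device speaker; the filename and all settings are placeholders.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, sosfilt

# "translator_line.wav" is a placeholder file; the settings below are illustrative only.
sr, x = wavfile.read("translator_line.wav")
x = x.astype(np.float64)
if x.ndim == 2:
    x = x.mean(axis=1)
x /= np.max(np.abs(x))

# 1. Narrow, telephone-style bandpass (tiny speaker).
sos = butter(4, [400.0, 3200.0], btype="bandpass", fs=sr, output="sos")
y = sosfilt(sos, x)

# 2. Simple pitch/speed shift by resampling ~6% faster (raises pitch, shortens the line).
idx = np.arange(0, len(y) - 1, 1.06)
y = np.interp(idx, np.arange(len(y)), y)

# 3. Light bit-crush to add a cheap digital edge.
y = np.round(y * 64) / 64

y /= np.max(np.abs(y))
wavfile.write("translator_futz_sketch.wav", sr, (y * 32767).astype(np.int16))
```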
What went into the sound of the crowds in New Asia, with its mix of robots, Simulants, and humans?
EA: David Bach was our team’s Dialogue and ADR Supervisor, and he mentioned that this was one of the most complicated films he’d ever worked on.
There are many languages represented, like Thai, Vietnamese, Hindi, Indonesian, Cambodian, and Japanese. Production sound mixer Ian Voigt captured a lot of great stuff on location, but much of the crowd dialogue was done in post.
We had at least three days of both indoor and outdoor “loop group” recording with all these languages. This was used for human and robot crowd scenes, but also announcements, advertisements, and radio communication. Of course, all the robots have their unique processing.
We also created a new “New Asian” language, based on an amalgamation of existing languages but “evolved” by the robots. We often mixed binary data “bleeps and bloops” into this language, and of course, Gareth himself voiced some of it for us.
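To give a flavor of what binary data “bleeps and bloops” could mean in practice, and purely as an editorial illustration rather than the film’s actual approach, here is a small Python sketch that turns the bits of a text string into alternating two-tone “data” bleeps; the tone frequencies, bit length, and phrase are arbitrary.

```python
import numpy as np
from scipy.io import wavfile

SR = 48000

def data_bleeps(text, bit_dur=0.03, f_zero=900.0, f_one=1500.0):
    """Turn the bits of a text string into alternating two-tone 'data' bleeps."""
    bits = [int(b) for byte in text.encode("utf-8") for b in f"{byte:08b}"]
    t = np.arange(int(SR * bit_dur)) / SR
    tones = {0: np.sin(2 * np.pi * f_zero * t), 1: np.sin(2 * np.pi * f_one * t)}
    # Short 3 ms fades on each bit avoid clicks between tones.
    fade = np.minimum(1.0, np.minimum(t, t[::-1]) / 0.003)
    return np.concatenate([tones[b] * fade for b in bits])

sig = data_bleeps("new asia")        # placeholder phrase, not actual dialogue
wavfile.write("data_bleeps_sketch.wav", SR, (sig * 32767).astype(np.int16))
```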
What was the most challenging scene for sound design? Why? What went into it?
EA: It was all challenging, so that’s hard to answer. The tank battle may be the most challenging, as the entire sequence was all sound design with no music score. Every detail becomes so much more important since it’s so exposed.
Hans Zimmer’s score is amazing! Did you get to collaborate with him so the score and sound design had a similar tone? Or did you get to hear any of the score while you were working on the sound design?
EA: We love working with Hans, and as always, he composed something very special. Though we’ve collaborated directly in the past, on this one Gareth was really driving that sound design and music interaction. He had a very clear idea of where he wanted music to take the lead, and vice versa for sound design.
Hans created “mock-ups” for us all to work with, which is so important because these days scoring sessions happen so close to the final mix. With “mock-ups,” we can tune our sound design rhythmically and tonally before we hit the mixing stage.
The final act of the film is where we all agreed, “music needs to lead here!” emotionally.
Did you do any custom recording for this film? If so, what did you record and how was it used in the sound design? Also, any helpful sound libraries?
EA: There was a bunch of recording done specifically for this movie. My absolute favorite recordings that I have are from decades of traveling around the world, especially Southeast Asia. Over many trips, I did sound-recording expeditions into the jungles of Cambodia, Vietnam, and Thailand. And The Creator was the perfect opportunity to put all these exotic sounds to use.
For The Creator, we did not use any sound libraries, only our own recordings.
How was working on the sound of The Creator a unique experience for you?
EA: Gareth is amazing to work with; it’s a non-stop creative experience. It’s hard to think of many filmmakers who are as passionate, as “out of the box,” and as tireless as him.
This was a really hard movie to pull off, but also one of the most inspiring and creatively thrilling of my career. I hope audiences hear it in Dolby Atmos, Dolby Vision, and IMAX, for the fullest and most pure experience.
A big thanks to Erik Aadahl for giving us a behind-the-scenes look at the sound of The Creator and to Jennifer Walden for the interview!