Here, they talk about honoring that '70s aesthetic while also creating a 'scoundrel adventure' inspired by Rogue One: A Star Wars Story and Solo: A Star Wars Story. They also discuss their creative and technical approach to building an immersive and reactive world, capturing fresh sounds on field recording trips to Australia, Tasmania, Tanzania, and locations around Sweden, using the slapback, bubblespace, and spatial seeds systems in the Snowdrop engine, and much, much more!
Interview by Jennifer Walden, photos courtesy of Ubisoft / Massive Entertainment
November 21st will be a big day for Star Wars Outlaws. Not only will the game get its release on Steam, but there will be a significant dev update (1.4) that improves combat, stealth, and character controls. (Read more on this update at Ubisoft.)
Star Wars Outlaws – developed by Massive Entertainment and published by Ubisoft – is set between the films of the original Star Wars Trilogy, so the sound of those films and that particular era in the Star Wars universe was a significant source of inspiration for the game. The Massive Entertainment sound team was also inspired by the scoundrel adventure of Solo: A Star Wars Story and the grittiness of Rogue One: A Star Wars Story.
Here, Massive Entertainment sound team members Simon Koudriavtsev (Audio Director), Martin Weissberg (Principal Sound Designer), Malin Arvidsson (Senior Sound Designer), Adam Oakley (Senior Sound Designer), Jacob Coles (Lead Audio Designer), and Charles Pateman (Lead Voice Designer) talk about how they honored the warm, analog ’70s sound in their work on the game, how they designed the sound of biomes, weapons, and creatures like Nix, how they handled the sounds of combat and stealth, and how they managed vocal variety and vocal processing for the many different sentient lifeforms in the game’s world. They also talk about field recording trips to places like Australia and Tasmania, making the most of the Snowdrop engine’s bespoke systems to create immersion, crafting a dynamic game mix, and so much more!
Star Wars Outlaws: Official Story Trailer
Why did you choose the late ’70s sci-fi aesthetic for Star Wars Outlaws? In what ways did the sound team honor this aesthetic in your approach to the sound?
Martin Weissberg (MW): We realized quickly while talking with the team that everybody had a different interpretation of what defines a Star Wars soundscape. The first movie was released in 1977, and the franchise has been thriving to this day. That’s a lot of legacy to respect and pick from. As we wanted to sound close to the movies, we had to choose which ones would be our main source of inspiration.
Considering that our game takes place between Star Wars: The Empire Strikes Back and Star Wars: Return of the Jedi, it seemed interesting to pay homage to the first Trilogy and the films that inspired them (like older Westerns or Samurai movies). It was also an opportunity to create a game that would sound different and hopefully stand out within the video game world.
Meanwhile, the tech art and engine teams were also working on a visual tool emulating old anamorphic lenses through distortion, bokeh, and lens flare as well as film grain, so it was clear to us that we were all going in the same direction.
…we decided that our game would be a contemporary take on ‘Star Wars’ sound that would retain a lot of character from the original Trilogy.
With Charles [Pateman], Jacob [Coles], and Simon [Koudriavtsev], we decided that our game would be a contemporary take on Star Wars sound that would retain a lot of character from the original Trilogy. We were also inspired by films like Rogue One: A Star Wars Story, as that story takes place within the same era as our game, and Solo: A Star Wars Story, as it explores the underworld and the criminal factions. And finally, The Mandalorian for its tremendous Western-inspired sound editing.
We came up with a few guidelines and key points that would be communicated to the sound and voice design teams. It was more about the end result rather than tools, DAWs, or plugins. A few of those were: “tape flavored,” “warmth over precision,” “nothing too piercing,” “analog over digital,” “favoring acoustic recordings.” Every aspect of the audio had to match the palette we wanted to achieve without it feeling like a parody.
To do so, we combined our own recordings with older sound banks that would have been recorded on tape, and with assets that Lucasfilm Games kindly provided us. It was more about bringing the clean, pristine sounds toward the warmth of older recordings than the other way around. We used a Revox B77 MK2 for that purpose (it unfortunately did not work for long!), and Charles created an especially useful EQ curve based on the frequency response of his Nagra tape recorder, which ended up being used on some sounds and most of the dialogue. Selecting the sources was the most important part of the work, and it was also essential to refrain from cleaning the older sounds too much or being too surgical with our modern tools.
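As a rough illustration of that kind of premaster move, here is a minimal sketch of a gentle, tape-style high-frequency roll-off applied to a dialogue file. The corner frequency, cut amount, and file names are invented for the example; this is not the actual curve measured from the Nagra.

```python
from scipy import signal
import soundfile as sf

def tape_flavor_eq(x, sr, hf_corner=9000.0, hf_cut_db=-3.0):
    """Gentle, tape-style high-frequency roll-off (illustrative only).

    hf_corner and hf_cut_db are made-up values, not the curve measured
    from the tape machine mentioned above.
    """
    # Split at the corner frequency and attenuate everything above it.
    sos = signal.butter(2, hf_corner, btype="lowpass", fs=sr, output="sos")
    lows = signal.sosfilt(sos, x, axis=0)
    highs = x - lows
    gain = 10.0 ** (hf_cut_db / 20.0)
    return lows + gain * highs

# Hypothetical usage on a dialogue file (file names are placeholders).
audio, sr = sf.read("dialogue_take.wav")
sf.write("dialogue_take_tape.wav", tape_flavor_eq(audio, sr), sr)
```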
Analog synths were favored, and many were used across the game soundscape. Joel Green, a senior sound designer on the team, used his Dreadbox for the UI, and we also used Moog and Arturia PolyBrute recordings all over the game (speeder, blasters, alarms); one layer of the Trailblazer was even made from a saturated pad created with a Dave Smith Instruments OB-6.
We also utilized a wide range of saturation outboards and plugins as well as EQs with broad curves to complement the already-rounded flavor…
We also utilized a wide range of saturation outboards and plugins, as well as EQs with broad curves, to complement the already-rounded flavor of the sources we had selected. We researched what gear was available to Ben Burtt for the original Trilogy and tried to use it when possible, or at least a plugin emulation, including trying out multiple flangers until we found the ones that suited our approach best.
But as I said earlier, the team was free to use the tools they wanted and knew best; what made the difference was that everybody on the team had a clear vision of what we were trying to achieve when designing the assets with a mixture of new, older, and tape/tube-flavored sound sources.
Can you talk about your collaboration with composer Wilbert Roget? How did the music and sound effects intertwine to create the cohesive sound of Star Wars Outlaws?
Simon Koudriavtsev (SK): The ambitious goal we had for the project from the start was to figure out how to build and expand on the well-known Star Wars music style. How do we stay true to the IP, but innovate in a way that fits with the “scoundrel adventure” direction of our game?
…Wilbert really took the “scoundrel” approach to heart and used things like found objects percussion or solo instruments played in unconventional ways…
One of the first steps we took with Wilbert, our composer, was to identify three core pillars. We knew that we had to have the big orchestral elements as defined by John Williams in the films. We also wanted to reflect the “grit” of the underground syndicates by introducing more analog, electronic elements. Finally, Wilbert really took the “scoundrel” approach to heart and used things like found objects percussion or solo instruments played in unconventional ways to add that third layer that might not be very common in Star Wars and this really gave the music a character of its own.
The next step was to define the important musical themes in the game and story, things like planet and moon, syndicate and character themes. The very first theme Wilbert wrote for the game was “Kay Vess, the Outlaw.” We all immediately fell in love with it, not only because of the instantly gripping melody, but also because it was hitting all three of those core pillars at once.
When it came time for the final recordings, we recorded the score with two different ensembles: the Royal Scottish National Orchestra in Glasgow, with 77 musicians and a huge room for that bigger, classical Star Wars sound, and a slightly smaller group at Ocean Way in Nashville, where we recorded strings and brass separately for more options in the final mix and the music system.
…senior music designer Erik Jacobson did a fantastic job of creating a music system that seamlessly integrates with an open world game.
Speaking of the music system, our senior music designer Erik Jacobson did a fantastic job of creating a music system that seamlessly integrates with an open world game. It’s not an easy task, especially given the sometimes erratic nature of complex orchestral pieces. But besides having the music flow seamlessly based on player and enemy actions, we were always mindful of leaving space in the mix for sound effects and of not overusing music everywhere.
We also spent a lot of time tweaking the mix to ensure that music wouldn’t fight with critical voice or sound effects. We really wanted to stay away from the classic “pumping” effect, so we used a lot of smart layering systems and dynamic EQs instead of HDR or traditional sidechaining.
Can you talk about your field recording trips for the game: where you went, what you captured, and how (mics/recorders) you captured it? Later, how was this material helpful in creating the ‘organic’ and natural sound/feel of the game?
MW: One thing we care about at Massive Entertainment is bringing something fresh to the player and, when it comes to sounds, bespoke recordings are key.
For Star Wars Outlaws, we went as far as we could. We had specific needs in terms of ambience, as we wanted the locations to sound vastly different from each other. Every planet or moon had different biomes and different weather to be illustrated with sound. So, from busy cities to savanna, desert, and tropical jungle, we had to cover many different scenes and landscapes. As Toshara is our own original creation in collaboration with Lucasfilm Games, we really wanted this moon to have its own signature.
Our senior sound designer Simon “Sigi” Siegfried Klär went to Australia and Tasmania for a week where he recorded nature sounds, caves, and even animals like Tasmanian devils, sea lions, and dolphins that were used for many different wildlife animals, birds, and ambience pads. These animal recordings specifically were used in many distinct locations in the game.
Sigi and Simon also went to the north of Sweden for another week to record snowstorms, wind pads, and rivers as well as some snow foley. They ended up being used in Kijimi, of course, but some of the snowy winds also ended up on Toshara since they had a sandy quality to them.
I had the chance to go to Tanzania to record a variety of pads and animals in the wild for about 20 days. We chose Tanzania for its variety of landscapes existing within a 300km radius. There I was able to record savannas, jungles, and desert winds on Kilimanjaro, as well as elephants, lions, hyenas, frogs, and insects. They were used mostly in Toshara and Akiva, but some of the sounds even ended up in the blasters.
To finish with ambience, there are a lot of sounds that come from our personal collections and recordings in diverse places – like recordings from Southeast Asia or walla from Morocco. We’re lucky to have a team that knows and loves to record. That comes with a lot of perks!
We also had a lot of smaller yet equally important field recording trips. We went to record buttons and levers at a museum in Malmö, which gave us a good amount of different sounds for the UI (coupled with Joel’s synth) but also for the Trailblazer cockpit and console interactions throughout the game. The yoke in a plane’s cockpit was still attached to some mechanical parts and had a nice, deep sliding sound that Adam ended up using for ND-5’s servos, a nice nod to the approach used for K-2SO in Rogue One: A Star Wars Story. We also recorded air exhausts from trucks and an industrial crane, which were used for the speeder, various doors, and in a sequence during one of the main missions. Finally, we had the chance to record robot servos at Fanuc in Malmö, which ended up on many lifts and doors in the game.
Added together, these recordings allowed us to create our own take on Star Wars and to play our part while respecting the existing legacy. I think it helped us render something more organic, since these were acoustic sources recorded with a purpose in mind, knowing that they would need to fit a specific aesthetic while sitting alongside, or being layered with, sounds that every fan of Star Wars would expect to be in our game.
To end with the gear side of recording, we’ve been using classic recording equipment, and when specific treatments were needed, they were applied in the box or with outboard gear. The recorders we used were: Sound Devices MixPres and a Sound Devices 722, Sony PCM-D100 and PCM-D50 (which tragically stopped working in Tanzania), Zoom F8, F6, and F3, and an AETA 4MinX. For mics, we used: Sennheiser MKH 8040s, MKH 416, and MKH 50, Schoeps CCM 4, DPA 4060s, a Sanken CO-100K, and a parabolic mic. Finally, we used the HC-22 shotgun mic from Rycote combined with their Windshield and their PCS quick release, which proved pretty handy when stuck in a car on a bumpy road in Tanzania.
In terms of your technical and creative approaches, how did you make this world feel alive and reactive? How did you make the sounds feel like they’re in the world and reacting to the environments they’re happening in?
SK: Alive and reactive-sounding worlds are something that we take pride in creating. Aside from putting a lot of effort into getting unique-sounding source material that helped breathe life into our game, we also managed to combine the best audio tech from The Division games with new tech from Avatar: Frontiers of Pandora, which gave us the best possible toolset for creating immersive and reactive environments.
I gave a GDC talk about environmental acoustics in 2020, which explains a bit about the slapback and bubblespace systems that we also used in Outlaws.
Aside from these, we also used the spatial seeds system from Avatar: Frontiers of Pandora, which Malin can expand more on later.
Snowdrop’s ray-traced propagation system helped with worldizing sounds even further and added that extra layer of depth for obstructed/occluded sounds. We also spent time carefully crafting the ambience systems to feel reactive, whether for combat or weather conditions. We really wanted the world to feel alive and somewhat logical, despite it being in a galaxy far, far away. Small details like rain dropping onto the tarps of Akiva, or strange-sounding alien fauna that stop making noise when you shoot, all add up to an immersive, dynamic experience that we are super proud of.
Aside from ambience, and in collaboration with Ubisoft Shanghai, we crafted a lot of wildlife creatures, each with their own behaviors and reactions. Some of the creatures have been seen in the films, and some are unique creations. The creatures were unique per planet and could be anything from the smallest of creatures to a huge Krayt Dragon. Having this wide range of creatures, complete with foley and vocalizations, also really added to the “alive” aspect.
I can’t finish writing this section without mentioning all the amazing work done by the voice team on this project. Charles and his team really brought the world to life with authentic voice treatments, fantastic walla, and all the small quirky alien noises you can hear in the cantinas and cities.
Can you talk about your approach to vocal processing in the game? What were some of your challenges in creating so many different dialogue processes?
Charles Pateman (CP): Vocal variety is a huge part of Star Wars’s distinctive sonic identity and plays a huge role in making the setting feel like the big, vibrant galaxy we see in the films. Getting this right for the era we were exploring was very important to us and we worked closely with Lucasfilm Games to make sure we were remaining authentic to the franchise.
Keeping things sounding cohesive within dialogue and across audio is important for any project, and with well over four hundred thousand assets in English, localized languages, and constructed languages like Huttese this was going to be an enormous challenge!
…with well over four hundred thousand assets in English, localized languages, and constructed languages like Huttese this was going to be an enormous challenge!
We used a combination of actors’ performances, runtime processing, and offline processes depending on the content. Performance is often at the core of Star Wars aliens, and working with actors to keep them grounded and on tone is a prerequisite for any vocal processing. Radio calls or collectible recordings are a good example of where we opted for runtime processing, as they could involve other sounds that needed to be played separately while sharing the same processing. Everything in the game had an offline component, though, regardless of vocal FX processing: no matter how human or alien a voice was, everything needed a unifying character that supported the style of the game’s audio while providing a reliable anchor point for the mix.
Strict microphone use, leveling, and light compression are a solid start, but it’s the EQ and tape emulation in our premaster processing that Martin mentioned earlier that did the hard work of making the game’s dialogue feel cohesive, and this needed to be refined throughout production as the game’s audio evolved. Vocal FX such as masks or droid processing, and other common variables like projection level, vocal register, acoustic/transmission treatment, or custom adjustments for specific characters, could then be folded into these base FX chains. To deliver detail consistently at this scale across a department of six voice designers, automation was a must-have.
Our lead audio programmer Martin Wallin created a mastering tool that would read in metadata from our text database and step through a series of processing stages. Our FX chains were handled using Reaper‘s command line batch tool, and GoldWave would add and strip silence to account for drift or temporal effects. We could then make our FX chains modular and turn those hundreds of thousands of potential combinations of plugins into a little over 100 individual FX chains, that could be applied incrementally in processing stages run by the mastering tool.
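As a sketch of how that kind of metadata-driven pipeline can be organized, here is a toy version of the stage-selection step. The field names, chain presets, and the command-line tool invoked at the end are all placeholders, not the project's actual database schema or tooling; the real pipeline was built around REAPER's batch converter and GoldWave as described above.

```python
from dataclasses import dataclass
import subprocess

@dataclass
class LineMeta:
    """One dialogue asset's metadata (field names are illustrative)."""
    file: str
    vocal_fx: str     # e.g. "none", "droid", "mask"
    projection: str   # e.g. "normal", "shout", "whisper"
    acoustic: str     # e.g. "dry", "transmission"

# Modular FX-chain stages mapped to saved chain presets (placeholder paths).
STAGES = {
    "droid":        "chains/vocal_droid.RfxChain",
    "mask":         "chains/vocal_mask.RfxChain",
    "shout":        "chains/projection_shout.RfxChain",
    "whisper":      "chains/projection_whisper.RfxChain",
    "transmission": "chains/radio_transmission.RfxChain",
    "premaster":    "chains/premaster_tape_eq.RfxChain",
}

def build_chain(meta: LineMeta) -> list:
    """Resolve the incremental processing stages for one asset."""
    chain = [STAGES[k] for k in (meta.vocal_fx, meta.projection, meta.acoustic) if k in STAGES]
    chain.append(STAGES["premaster"])  # unifying character applied last
    return chain

def process(meta: LineMeta) -> None:
    for preset in build_chain(meta):
        # Placeholder command: hand the file and chain preset to whatever
        # command-line batch processor runs each stage.
        subprocess.run(["batch_fx_tool", meta.file, "--chain", preset], check=True)
```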
With automation handling the bulk of the grunt work, we could focus more time on content that required manual work, such as touch-ups, special characters, or specific moments, while ensuring everything felt cohesive and authentic to the universe. Critically, we could develop and establish our FX chains early and record as late as possible, which allowed the voice design team to dedicate more time to adding the variety and detail that the game needed.
Can you talk about your approach to creating the ambiences/backgrounds for the different locations in the game? What was the most challenging location to design?
Malin Arvidsson (MA): Star Wars Outlaws was an exciting project for ambiences since we had five different locations, each with its own landscapes and weather. It gave us an opportunity to create a lot of varied soundscapes.
I was lucky enough to work on Akiva, which is a lush location with plenty of fauna and dense vegetation that would move with the wind. One thing I was excited about was finally getting to use all those strange-sounding birds I’ve recorded in the past, which are difficult to fit into any other project because they stand out too much.
…adding wind to props like plants, tarpaulins, and metal structures helps create some variety in this kind of landscape.
Sigi had an interesting challenge with his work on Toshara. It’s a very windy moon, which required making wind sound interesting. I think he did an amazing job bringing life to Toshara! We could also make use of our wind emitter system, both on Toshara and on Tatooine, which can detect materials, wind direction, wind strength, how close you are to an edge, and more, and trigger different sounds accordingly. Adding wind to props like plants, tarpaulins, and metal structures also helps create some variety in this kind of landscape.
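To illustrate the idea of those wind emitters, here is a minimal sketch of picking a prop sound from material and wind conditions. All names, thresholds, and sound IDs are invented for the example; this is not Snowdrop's actual wind system.

```python
# Hypothetical wind-emitter selection; names and thresholds are invented.
WIND_SOUNDS = {
    "plant": ["wind_grass_soft", "wind_grass_gust"],
    "tarp":  ["wind_tarp_flap_light", "wind_tarp_flap_heavy"],
    "metal": ["wind_metal_whistle", "wind_metal_rattle"],
}

def pick_wind_sound(material, wind_strength, near_edge):
    """Choose a prop wind sound from material and wind conditions.

    wind_strength is assumed normalized 0..1; near_edge biases toward
    the more intense variation (e.g. close to a cliff edge).
    """
    options = WIND_SOUNDS.get(material)
    if not options or wind_strength < 0.1:
        return None  # too calm to trigger anything
    intense = wind_strength > 0.6 or near_edge
    return options[1] if intense else options[0]

print(pick_wind_sound("tarp", wind_strength=0.8, near_edge=False))  # wind_tarp_flap_heavy
```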
Mostly, we used simple pads playing in quad, containing mainly an airtone or some subtle wind, and we complemented that with our spatial seeds system, which is part of our Snowdrop ambience tech.
As a brief explanation, spatial seeds are sound emitters that are attached to props. As you play the game, they spawn in and out based on a budget that we can set per area, time of day, and any other factor we want to use, and they also depend on distance to the player and to each other. This ensures we have a constantly changing soundscape.
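Here is a toy version of that budgeted spawn/despawn pass, just to illustrate the logic. The data layout, ranges, and budget values are invented; the real spatial seeds system lives inside Snowdrop's ambience tech.

```python
import math
from dataclasses import dataclass

@dataclass
class Seed:
    position: tuple   # (x, y, z) of the prop the emitter is attached to
    sound: str
    active: bool = False

def update_spatial_seeds(seeds, player_pos, budget, max_range=40.0, min_spacing=8.0):
    """Toy spawn/despawn pass over the seeds near the player.

    budget is the maximum number of simultaneously active emitters for
    this area (and could be driven by time of day or any other factor).
    """
    # Despawn anything that has drifted out of range.
    for s in seeds:
        if s.active and math.dist(s.position, player_pos) > max_range:
            s.active = False

    # Spawn new seeds, closest first, respecting budget and spacing.
    candidates = sorted((s for s in seeds if not s.active),
                        key=lambda s: math.dist(s.position, player_pos))
    for s in candidates:
        active = [a for a in seeds if a.active]
        if len(active) >= budget or math.dist(s.position, player_pos) > max_range:
            break
        if all(math.dist(s.position, a.position) >= min_spacing for a in active):
            s.active = True  # in the engine, this would start the emitter
```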
The voice team did an amazing job adding life to cities and cantinas with their walla systems, so we had an easier job making these areas sound interesting.
And of course, a lot of the ambient sounds were covered with all the NPC droids and vehicles moving about in busy areas.
How does your sonic approach to the terrestrial locations compare to the ones in space?
SK: As Martin mentioned previously, we spent a lot of time researching and recording different materials to be able to cater to the vastly different biomes of our game. Each location had a specific art direction “world bible,” which we used as a base for our sonic approach. Our goal was to make them sound unique and different from each other, whilst keeping in line with the art direction. Whether it was the cold, mystical sounds of Kijimi, the grass moving in the wind on Toshara, the desert sand swirls of Tatooine, or the raindrops on leaves in the Akiva jungle, we wanted the ambience bed to feel dynamic and realistic, but sometimes also surprising.
When it comes to space, we focused less on constant “space pads” and more on the discovery of things like space stations, the whoosh of asteroids flying by you, or even space jellyfish :)
What was your inspiration for your weapons’ designs? Can you talk about what went into the sounds for your favorite weapons in the game?
MW: Like the rest of the game, we primarily wanted the weapons to be true to the movies. As mentioned earlier, we wanted our game to pay homage to the first Trilogy, so the weapons needed to follow the same direction, as they would be at the forefront of the game’s soundscape.
Kay’s blaster, for example, was made from acoustic recordings (bullet zaps/pass-bys and ricochets, as well as a Slinky and metal cable), and for the laser part, I went with a bit of Moog filter sweep. It was important that it represented Kay as a character in the same manner as Han Solo’s iconic blaster. We wanted something that would sound “scoundrel” with a slight Western tone to it, as opposed to a very industrial sound from an Imperial blaster pistol. Kay’s blaster needed to feel tight and not too heavy as Kay is an agile and light character who uses Nix, her cunning tricks, and her finesse to get out of tricky situations.
The second part of her blaster was made from a mix of older gun recordings from Sound Ideas that had very warm tones, combined with newer recordings saturated with the Culture Vulture and then further treated in the box. It was interesting to use gun sounds in a creative way where we wouldn’t care too much about acoustics at the creation stage, while trying to get the weight and feel to support the laser layer.
Finally, a thump and sub layer were created from a PZM recording and a very low Moog filter sweep to add punch to the sound in a subtle way. While this is a modern game, we wanted to favor a more old-school vibe over ultra-punchy sounds.
It’s difficult to choose my favorite sounding weapon for this game but I had a lot of fun designing the weapons of Kay, Vail, and ND-5 as I was trying to represent their character through them.
…I had a lot of fun designing the weapons of Kay, Vail, and ND-5 as I was trying to represent their character through them.
I think ND-5’s blaster is probably the one I’m proudest of, as I tried to get a bigger, meaner sound for him than for the rest of the main characters. It was heavily influenced by The Mandalorian in its conception and by old Westerns like The Good, the Bad and the Ugly. It only has a few layers from “Larger Than Life” (Sound Ideas library), old laser sounds that I think were made from heavily distorted ricochets and bullet pass-bys. The creation process was similar to Kay’s blaster except for the real gun sounds: I used Mossberg shotguns shooting slugs instead of smaller handguns.
But I wasn’t working alone on the blaster sounds for this game; I had invaluable help from Mattia Cellotto, our expert sound designer. He seamlessly picked up the direction for weapon sound design after joining midway through production and brought his own distinct ideas and elements to the table.
There’s a lot of running around in this game! Can you talk about the foley work? What was your approach to the hand-to-hand combat sounds?
Adam Oakley (AO): It was quite an undertaking! The foley was provided by the team over at Ubisoft Studio Alice, and having five locations of varying biomes certainly made the request quite extensive. One of our key focuses going in was having lots of diversity in textures and surfaces, but also finding ways to keep things feeling cohesive as you move through the game.
One of our key focuses…was having lots of diversity in textures and surfaces, but also finding ways to keep things feeling cohesive as you move through the game.
We relied a lot on sweeteners to help with this, as it allowed us to keep the core ingredients consistent while supplementing for variety. David Driggers (senior audio programmer) helped us create a system specifically for wetness and snow, for example, which allowed us to balance between sweetener content/contribution and the underlying surface depending on depth. So, as you run over an area with a smattering of snow, you’ll predominantly hear the floor surface with an occasional crunch played on top, but as you move into the deeper stuff, the ground material is gradually taken over by the respective snow content.
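A stripped-down illustration of that depth-based balancing might look like the following; the thresholds and gain curve are invented for the example, not the values from the actual system.

```python
def footstep_layer_gains(snow_depth, shallow=0.05, deep=0.30):
    """Blend base surface vs. snow sweetener by snow depth (in meters).

    Below `shallow` you mostly hear the underlying surface with the odd
    crunch on top; by `deep` the snow content has largely taken over.
    Thresholds and curve are made up for the example.
    """
    t = (snow_depth - shallow) / (deep - shallow)
    t = max(0.0, min(1.0, t))        # clamp to 0..1
    surface_gain = 1.0 - 0.8 * t     # never fully silent underneath
    sweetener_gain = t
    return surface_gain, sweetener_gain

for depth in (0.0, 0.1, 0.2, 0.4):
    print(depth, footstep_layer_gains(depth))
```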
Whilst Martin and I were visiting Montreal for one of the foley sessions, Alexis Farand (foley artist) managed to source a super interesting ’80s cruiser motorcycle, which we wheeled into the studio to record a bunch of layers for Kay’s speeder. Everything from mounting, suspension, switches, impacts, scrapes, clunks, and rattles – it became a super useful palette of sounds to help ground the speeder in reality.
I did a recording session here at Massive Entertainment where I smashed up a bunch of things like a typewriter and (poetically) a toy stormtrooper helmet…
The hand-to-hand combat was a double-edged sword as we had the quicker-paced melee which tended to happen in more frantic scenarios, and the more stealth-oriented takedowns. We aimed to keep melee quite snappy whilst offering enough tactility for player feedback that was punchy and satisfying. I did a recording session here at Massive Entertainment where I smashed up a bunch of things like a typewriter and (poetically) a toy stormtrooper helmet, which all helped to give some additional variation and specificity depending on which NPC faction you’re dueling.
For takedowns, we took a little more creative freedom, but it was quite a challenge to find a formula that fit both Kay’s character and the overarching audio direction. Given the number of bespoke takedown animations in the game, Istvan Stenberg (an audio director at Ubisoft Stockholm) and I went through multiple iteration stages before eventually settling on a recipe which we felt struck the right balance.
A lot of the takedowns had multiple parts like grabs or double impacts, and we found that approaching these from an almost musical perspective really helped with the dynamic feel. Being conscious of rhythm and the relationship between softer and harder impacts allowed us to shape the whole movement around key points of the takedown. We supplemented this with a subtle use of whooshes, primarily to highlight certain visual cues like on aerial takedowns or with camera swings; hopefully combining to hit the more cinematic notes we felt these actions required.
What went into developing the sound of Nix?
CP: We started with a prototype of Nix developed by Martin Weissberg, using cheetah recordings combined with vocalizations of his newborn son. What we learned from this prototype was that for a character like Nix to have soul, we really needed expression and intention, and the only way we were going to get that was through a voice actor’s performance.
We decided to enlist the expertise of Dee Bradley Baker, who has extensive experience and insight in creating non-verbal characters and is a long-time Star Wars voice acting veteran.
He worked closely with Oskar Hansson (voice designer), who was handling the design and implementation side of things. Oskar would hunt through the game searching for opportunities and the two of them would work through the various scenes and gameplay scenarios, exchanging ideas and recording. Oskar would then edit and process those performances afterward and bring the results to the next recording session where they’d listen back and iterate from there, over time building the character into a living, breathing merqaal with oodles of personality.
We very much wanted every utterance Nix made to be tailored to the moment, with content either being recorded bespoke for cutscenes and key gameplay moments or hand-authored from the Nix “palette” we had developed. To ensure we kept his character consistent, Oskar was the only voice designer who created these “lines” as he arguably understood the character every bit as well as Dee did.
In terms of post-production, Nix was light on processing and heavy on editorial, with a degree of pitching and reversing folded in to eliminate what little human qualities remained. I’d say about 98% of the material we used for him was from Dee with only the odd animal or two spliced in here and there where a specific need arose. As a nod to Nix’s origins, Oskar decided to use some recordings of Martin’s son for Nix’s snoring, which finally came full circle when Martin found his son snuggled up with his Nix plushie snoring away!
How was the sound team able to take advantage of the Snowdrop engine? Also, did you use any middleware for Star Wars Outlaws, and if so, can you talk about how that was helpful in conjunction with the Snowdrop engine?
Jacob Coles (JC): As Simon mentioned previously, we started in a clean branch of Snowdrop, where we were able to merge the best of the tech we had from The Division 2 with the best of the tech we had from Avatar: Frontiers of Pandora, and even improvements from other Ubisoft titles using Snowdrop.
…we were able to merge the best of the tech we had from ‘The Division 2’ with the best of the tech we had from ‘Avatar: Frontiers of Pandora’…
Aside from the aforementioned slapback, bubblespace, and spatial seeds systems, we also had ray-traced audio propagation, meaning that the way sound travels through a room or a portal sounded very accurate and added that extra level of realism. We go more in-depth on how it works in this blog article.
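As a conceptual illustration only (not Snowdrop's implementation, which the blog article above covers properly), ray-based obstruction can be thought of as casting a handful of jittered rays between listener and source and mapping the blocked fraction onto filtering and level:

```python
import random

def obstruction_amount(listener, source, ray_blocked, n_rays=16, spread=1.5):
    """Estimate obstruction by casting jittered rays from listener to source.

    ray_blocked(a, b) -> bool stands in for the engine's ray query against
    world geometry; everything here is a toy illustration of the idea.
    """
    blocked = 0
    for _ in range(n_rays):
        # Jitter the target so a partially open doorway lets some rays through.
        jittered = tuple(c + random.uniform(-spread, spread) for c in source)
        if ray_blocked(listener, jittered):
            blocked += 1
    return blocked / n_rays  # 0 = fully open, 1 = fully occluded

def obstruction_to_mix(obstruction):
    """Map obstruction to simple mix parameters (illustrative curve)."""
    lowpass_cutoff_hz = 20000.0 * (1.0 - 0.9 * obstruction)
    volume_db = -12.0 * obstruction
    return lowpass_cutoff_hz, volume_db
```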
For Star Wars Outlaws, since we have a stealth element to the gameplay, this added an extra layer of suspense. You could hear the sounds of enemies, or a different environment, bleed through a door when it opened, creating a sense of foreshadowing.
As for middleware, we use Audiokinetic’s Wwise on all the Snowdrop projects. It’s a fantastic tool, and we have a great relationship with them when it comes to support and developing new features.
What was your approach to mixing Star Wars Outlaws? How does your approach to mixing combat situations compare to stealth situations? What about exploration? What were your biggest challenges in mixing this game?
SK: Mixing Star Wars Outlaws was not an easy task. It’s an enormous game, full of varied boots-on-the-ground and space gameplay, with opportunities around every corner. Even though we tried to “mix as we go,” a lot of the crucial elements (like final voice or music) only came towards the end of production, so we spent around a month in our 7.1.4 mix room, playing through the game as best we could, trying to catch anything that didn’t sit right in the mix.
The biggest challenges were space combat…, location-side combat (plus the added layer of stealth), and finally the clarity of the narrative.
The biggest challenges were space combat (as it was our first time doing it, and it carries a lot of fan expectations), location-side combat (as things could get fairly busy, plus the added layer of stealth), and finally the clarity of the narrative. Our main mixing philosophy was similar to that of the movies: voice and music first, with the sound effects supporting them. We didn’t want to create a fatiguing, over-the-top experience.
For space combat, we spent a lot of time tuning the sound of Kay’s ship, the Trailblazer. Our principal sound designer Martin [Weissberg] had developed a great palette to work from. But now we had to decide how and when it should be audible, whilst also taking into consideration all the upgrades/modifications you could make to the ship. We had multiple RTPCs that affected the engine and weapon sounds, based on movement input and in/out-of-combat states.
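To give a flavor of what driving those RTPCs per frame can look like, here is a minimal sketch. The parameter names, ranges, and the set_rtpc callback are invented stand-ins for the engine's calls into Wwise, not the game's actual parameter set.

```python
def update_trailblazer_rtpcs(throttle, boost, in_combat, set_rtpc):
    """Compute per-frame RTPC values for the ship's engine and weapons.

    set_rtpc(name, value) stands in for the engine's call into Wwise;
    the RTPC names and ranges here are invented for illustration.
    """
    # Engine load follows stick input, with an extra kick while boosting.
    engine_load = min(1.0, throttle + (0.3 if boost else 0.0))
    set_rtpc("trailblazer_engine_load", engine_load * 100.0)

    # Combat state opens up the more aggressive engine/weapon layers.
    set_rtpc("trailblazer_combat_intensity", 100.0 if in_combat else 0.0)

# Example frame update with a stub callback.
update_trailblazer_rtpcs(throttle=0.7, boost=True, in_combat=True,
                         set_rtpc=lambda name, value: print(name, value))
```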
We also utilized HDR to make the “stunt moves” pop out in the mix. Then we moved on to tuning all of the NPC ship engines and their weapons, working on a lot of custom attenuation curves for the different ship sizes, as well as on what would be most audible in and out of combat. Here we had great support from Ubisoft Bucharest, with their lead sound designer, Silviu Tanase, spending a lot of time tweaking all the enemy ships into a workable state for the mix.
Once we had a good relationship between the player and enemy ships, we then added Wilbert’s amazing music on top of the mix…
One final feature we added during the mix was taking into account whether the enemy ships were hostile towards the player or not, again to help de-clutter the soundscape. Once we had a good relationship between the player and enemy ships, we added Wilbert’s amazing music on top of the mix and tested everything against voice levels, to ensure that Kay and her crew were audible even in the biggest of space fights. To make the voice pop through, we utilized dynamic EQ ducking; this gave a more “seamless” ducking effect and avoided that “pumping” feeling.
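The core idea behind that kind of dynamic EQ ducking is that only a speech-range band in the music/SFX bus moves, and only by as much as the voice level calls for. A toy gain computation, with made-up threshold and depth values, might look like this:

```python
def speech_band_duck_db(voice_level_db, threshold_db=-30.0, max_dip_db=6.0):
    """How much to dip a speech-range band in the music/SFX bus.

    Only one EQ band moves, and only in proportion to how loud the voice
    currently is, which avoids the broadband "pumping" of classic
    sidechain compression. Numbers are illustrative, not mix settings.
    """
    if voice_level_db <= threshold_db:
        return 0.0
    # Scale the dip over a 20 dB window above the threshold.
    amount = min(1.0, (voice_level_db - threshold_db) / 20.0)
    return -max_dip_db * amount

for level in (-40, -25, -15, -5):
    print(level, round(speech_band_duck_db(level), 2), "dB")
```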
MW: I think one good trick was also to use MS saturation and EQ during the phases where the player is using Kay’s speeder or the Trailblazer. By slightly pushing the music to the sides, we were able to carve out space in the center for the vehicles and the voices to fill. It gave a nice balance between music and sound effects from the start, without too many drastic adjustments.
SK: For location-side combat, we created a “bucket” system, which groups NPC enemies into different buckets of importance based on distance and threat to the player. We then made sure that the ones closest to you or most threatening were more audible than the others, and this created a nice base from which to tune the combat mix. Here again, we wanted to stay away from a fatiguing experience and create more room for a cinematic mix.
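A toy version of that bucketing step could look like the following; the bucket sizes, attenuation values, and the importance formula are invented for illustration:

```python
import math

def assign_combat_buckets(enemies, player_pos, bucket_sizes=(3, 5)):
    """Toy importance bucketing for combat voices and SFX.

    enemies: list of dicts with 'pos' (x, y, z) and 'threat' (0..1).
    The closest / most threatening few land in bucket 0 (full level),
    the next few in bucket 1, and everyone else in bucket 2. Sizes and
    attenuation values are invented for the example.
    """
    def importance(e):
        return e["threat"] - 0.01 * math.dist(e["pos"], player_pos)

    ranked = sorted(enemies, key=importance, reverse=True)
    a, b = bucket_sizes
    buckets = {0: ranked[:a], 1: ranked[a:a + b], 2: ranked[a + b:]}
    attenuation_db = {0: 0.0, 1: -4.0, 2: -10.0}
    return buckets, attenuation_db
```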
…we had our dynamic music system adjust the score to really add to the stealth vs. combat feeling.
For stealth gameplay, we ensured that NPC footsteps and VO were slightly more audible when hiding, and on top of it all, we had our dynamic music system adjust the score to really add to the stealth vs. combat feeling. All the music layers would react to things like the number of enemies, their threat to you, or being spotted and engaging in full combat.
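As a rough sketch of how layer levels can follow that kind of combat pressure, here is a toy mapping; the layer names and curves are invented, not the game's actual music system:

```python
def stealth_music_layer_gains(enemy_count, max_threat, spotted):
    """Map combat pressure to relative gains for three made-up music layers.

    A low tension bed always plays, a "sneaking" layer follows pressure
    while hidden, and the full combat layer only opens once spotted.
    """
    pressure = min(1.0, 0.15 * enemy_count + 0.5 * max_threat)
    return {
        "tension_bed":   1.0,
        "stealth_layer": 0.3 if spotted else pressure,
        "combat_layer":  1.0 if spotted else 0.0,
    }

print(stealth_music_layer_gains(enemy_count=4, max_threat=0.6, spotted=False))
print(stealth_music_layer_gains(enemy_count=4, max_threat=0.6, spotted=True))
```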
As our main goal was telling a scoundrel story, we spent a lot of time making sure that all the critical features in the game had a way of creating space in the mix for voice. Whether it was combat, stealth, or exploration on the speeder, we constantly tested everything against voice, utilizing dynamic EQ ducking, mid-side compression, or mix states to ensure that the sound effects wouldn’t collide with the critical narrative. For exploration specifically, we decided not to “drown” everything in music and instead opted for an occasional music “flavors” approach. Planet/moon (or space) specific exploration music would play occasionally, creating nice moments for the player while leaving a lot of mix room for ambience, weather, and wildlife sounds in between those moments.
Towards the end of the mix, we also spent time creating specific mastering settings for TV speakers, soundbars, and headphones, as well as high and low (aka night) dynamic range modes.
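Conceptually, those endpoint settings can be thought of as a small table of per-output profiles plus dynamic range modes. The structure and every value below are invented for illustration, not the shipped settings:

```python
# Illustrative per-output mastering profiles; every value is invented.
MASTERING_PROFILES = {
    "tv_speakers": {"highpass_hz": 80, "limiter_ceiling_db": -1.0, "dynamic_range": "low"},
    "soundbar":    {"highpass_hz": 50, "limiter_ceiling_db": -1.0, "dynamic_range": "medium"},
    "headphones":  {"highpass_hz": 20, "limiter_ceiling_db": -0.5, "dynamic_range": "high"},
}

# Dynamic range modes: how hard the endpoint compression works ("low"
# roughly corresponds to a night mode).
DYNAMIC_RANGE_MODES = {
    "high":   {"compressor_ratio": 1.0},  # effectively bypassed
    "medium": {"compressor_ratio": 2.0},
    "low":    {"compressor_ratio": 4.0},
}

def resolve_mastering(output_device):
    profile = MASTERING_PROFILES[output_device]
    return {**profile, **DYNAMIC_RANGE_MODES[profile["dynamic_range"]]}

print(resolve_mastering("tv_speakers"))
```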
What are you most proud of in terms of your sound work on Star Wars Outlaws?
SK: It’s hard to choose one thing, but if I must, it would be that we managed to figure out how to make a game sound and feel like Star Wars, whilst adding our own touch to it. I am incredibly proud and thankful to all of the talented audio staff on the project across all of our studios, all the wonderful composers we’ve collaborated with, our creative director Julian Gerighty, as well as Lucasfilm Games for trusting me with the audio vision.
A big thanks to Simon Koudriavtsev, Martin Weissberg, Malin Arvidsson, Adam Oakley, Jacob Coles, and Charles Pateman for giving us a behind-the-scenes look at the sound of Star Wars Outlaws and to Jennifer Walden for the interview!