In the lead-up to the release of The Mandalorian, fans were excited for the first-ever live-action Star Wars TV show. But I don’t think anyone anticipated just how groundbreaking the filmmaking of this Disney+ series would be. Spearheaded by creator Jon Favreau, much of The Mandalorian was shot on a soundstage in something called a “volume.” The idea isn’t entirely new – the Star Wars prequels were largely shot on soundstages against bluescreen backgrounds that would eventually be filled in with visual effects. But what set The Mandalorian apart was that instead of using a blank background, the set was surrounded by LED panels that rendered the actual VFX backgrounds in real-time. This meant that when Oscar-nominated cinematographer Greig Fraser set about lighting a scene on the outpost world of Nevarro, he didn’t have to imagine what the background would look like. It actually appeared in front of him, informing every single lighting choice made for the actors and physical props.
It’s not hyperbole to say this is absolutely game-changing technology, and when I recently spoke with Fraser for an extended (and wide-ranging) interview about his work and career as part of our Collider Connected series, he noted how significantly the technology had changed between the time he shot Rogue One and The Mandalorian:
“Nothing like this had been put together specifically. There were things like it on Rogue One, which effectively was the genesis of the concept with ILM. On Rogue One we built a kind of a volume. Around the spaceships we built a horseshoe and a lid and an uplight, and we effectively built the same concept in terms of lighting, but we didn’t have a real-time 3D gaming engine interaction. We didn’t have that because remember that was 2015, and even though that’s not long ago, it’s long enough ago that the LED panels were 9mm then and now they’re 2.4mm on The Mandalorian. So that tells you how much the technology has progressed in a couple of years.”
So on Rogue One, Fraser and his team could create lighting scenarios for the backgrounds, but they couldn’t render the actual backgrounds in real-time like they did on The Mandalorian. And as the tremendously talented DP explains, this groundbreaking technology was born out of a desire to make a Star Wars TV show viable from a budget perspective:
“Doing a Star Wars TV show could be prohibitively expensive because Star Wars requires a lot of prop building and a lot of character building, so we wanted to – with ILM’s help – be able to make it a financially viable option to solve all the problems that you have with shooting a blue screen environment.”
Why is rendering the backgrounds in real-time such a game-changer for cinematography? Fraser explains:
“If you go into a studio without a set, effectively you’ve got a blue screen. As a DP you have to light it as what you think it should look like. You don’t have any reference of what the background looks like. You might have some concepts, but effectively you’re lighting it as what you think it should look like. You’re framing it as what you think it should look like.”
On The Mandalorian, they didn’t have to imagine what the sets would look like. They could see them with their own eyes.
Fraser says the pressure was on because a lot of money had been invested to get this technology off the ground. They held camera tests in June 2018 before filming began in October, and luckily it worked:
“It was a very rewarding experience. There was a lot riding on ILM’s shoulders at that point, a lot riding on my shoulders, a lot riding on Jon’s shoulders, because there was a lot of money invested in the hardware. If this technology didn’t work, if we turn up on Day One, everybody’s done their job – costume’s done their job, everyone knows their lines – if we turn up on Day One and it does not work, there is no Plan B. All these problems that could occur we were trying to get ahead of and pre-empt. Thankfully the worst did not happen and we always had something to shoot.”
The results look absolutely incredible, as glimpsed on the Disney+ behind-the-scenes series Disney Gallery: The Mandalorian. It’s nearly impossible to tell what was shot on a soundstage and what was shot outdoors, and it lends a cinematic touch to the entire aesthetic. Even props in the background could be rendered using the game engine. Again, it’s not an exaggeration to say that what Fraser and his fellow Mandalorian cinematographer Baz Idoine accomplished here is groundbreaking. And it’s only the beginning.
Fraser envisions a future in which almost every major project uses this technology, from big Hollywood blockbusters down to indies:
“I see a world where almost every film will use this technology in some way, shape or form. Be it from a $250 million blockbuster down to a $2 million independent movie using it for one sequence that they dry hire a studio that’s already been built and they get in there like a location. So I believe when the technology kicks on and gets widely adopted, when people understand what it can do, I believe it’ll be used quite a lot.”
Indeed, as Hollywood is still trying to figure out how to safely return to production in the middle of a pandemic, and with various countries closing their borders to the United States over health concerns, it’s not hard to imagine this technology being used to transport actors to international locales without their actually having to go there. And unlike blue screen, which challenges cinematographers to light the foreground to match a background they can’t see, this new tech makes the transition between location and soundstage work seamless.
Check out the video interview clip above for more from Fraser on The Mandalorian, including what it was like to light “Baby Yoda” and how the technology has been improved for Season 2. And look for the full Collider Connected interview only on Collider on Monday, July 6th.
For more on The Mandalorian, check out what we learned from the behind-the-scenes docuseries.