Marion Spates, VFX Supervisor for Avatar: The Last Airbender, Uses LiDAR to Fuel Creativity

Visual effects (VFX) have long seemed like magic to film and TV audiences—at their finest, they go unnoticed as “effects,” effortlessly blending with real-world elements within the frame to appear as realistic as possible.

Striving for seamless realism is core to Marion Spates’ approach. As a VFX Supervisor for Netflix’s Avatar: The Last Airbender, Spates employed multiple cutting-edge technologies—one of which was LiDAR—to help achieve his creative vision and make this fantastical world feel real.   

“Our creative approach is making sure we’re telling the story that Netflix is trying to tell,” Spates said.  “We have to make sure the story is right.”

Where the creative vision begins

Bridging the gap between reality—like the intricate geometry of a complex set for Avatar: The Last Airbender—and imagination—in this case, VFX—is a challenging task that demands skill from creators in any artistic medium.

That’s where a VFX veteran like Marion Spates comes in. To help with this endeavor, Spates turned to laser scanning to accurately capture the Avatar sets, which provided the 3D data required for the title’s complex VFX workflows.

“Because our show is such a high-end visual effects show, and because my goal has always been to make visual effects as real as possible, capturing the real-world data of the set gives us the foundation to make our visual effects look real,” Spates explained. “We used LiDAR to capture the environment, and the LiDAR told us the exact location of all the geometry in the frame. We scanned the entire set on our own, prior to or after shooting, and used the LiDAR on a per-slate basis to know if any of our props or set pieces moved.”

Leica Geosystems Technical Specialist Andy Fontana used the Leica RTC360 laser scanner to create this high-resolution 3D “base scan” of the set at the start of the day. This foundational scan was crucial for tracking any changes over the course of the shoot.
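The article doesn't detail how a per-slate scan is compared against the base scan, but the idea of detecting moved props can be sketched in a few lines. The following is a simplified, hypothetical illustration (not the production pipeline, which would use registered high-density point clouds and dedicated software): each point in the newer scan is checked against its nearest neighbor in the base scan, and points that have drifted beyond a tolerance are flagged.

```python
import numpy as np

def flag_moved_points(base, current, threshold=0.05):
    """Flag points in `current` that lie farther than `threshold` (metres)
    from any point in the base scan. Brute-force nearest neighbour,
    fine for these small illustrative clouds."""
    # pairwise distances, shape (n_current, n_base)
    d = np.linalg.norm(current[:, None, :] - base[None, :, :], axis=2)
    return d.min(axis=1) > threshold

# base scan: a few points on a prop (hypothetical coordinates)
base = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
# per-slate scan: first two points unchanged, third shifted by 0.3 m
current = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.3, 1.0, 0.0]])
print(flag_moved_points(base, current))  # only the shifted point is flagged
```

In practice the two scans would first be aligned (registered) into a shared coordinate system before any comparison; the sketch assumes that has already happened.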

Laser scanning to capture data for VFX in between takes

Once production started for the day, everyone was on their A-game: every department was alert, and the set fell nearly silent; the only communication behind the scenes came over walkie-talkies.

“Ultimately, the LiDAR was used for camera tracking purposes, and we created a low-res representation of the geometry for lighting or any type of interactivity of our ‘bending’ elements,” Spates said. “For instance, there’s firelight cascading on the walls and water from water ‘bending’ – so when elements fall on the ground, we know where everything is in 3D relative to the shot that was captured.” 
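Deriving a low-res representation from a dense scan typically means decimating the point cloud. As a minimal sketch of the concept (assuming numpy only; real pipelines use dedicated mesh and point-cloud tools), a voxel-grid downsample collapses every occupied cube of space into a single averaged point:

```python
import numpy as np

def voxel_downsample(points, voxel=0.25):
    """Collapse a dense point cloud into one averaged point per voxel,
    a quick way to derive a low-res lighting/interaction proxy."""
    keys = np.floor(points / voxel).astype(int)
    # group points by voxel key, then average each group
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.reshape(-1)
    counts = np.bincount(inverse)
    out = np.zeros((counts.size, 3))
    for dim in range(3):
        out[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
    return out

rng = np.random.default_rng(0)
dense = rng.uniform(0.0, 1.0, size=(10_000, 3))  # stand-in for a dense scan
proxy = voxel_downsample(dense, voxel=0.25)      # 4x4x4 grid of voxels
print(len(dense), "->", len(proxy))
```

The voxel size controls the trade-off: coarser voxels give lighter geometry for lighting and interaction passes, at the cost of spatial detail.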

Andy Fontana helped capture LiDAR data for the VFX department, and he explained how easy it became to take laser scans even in a fast-paced filming environment where time was of the essence.

“I scanned as much as I possibly could, in between each slate, or even a couple of ‘takes’ within a scene,” Fontana said. “With the Leica BLK360, I could capture complete scans within as little as 20 seconds between takes, right when the director yelled ‘CUT!’ and actors scrambled out of the way with everyone behind the scenes running out on stage to set up for the next shot.”

“Capturing the data when we need it on set, during filming, is crucial – especially when it’s as fast as the BLK360 – so it doesn’t interrupt filming,” Spates added. “It gives us the foundation to make our visual effects look real.”

While Fontana would bounce around the set scanning between takes, Spates sat behind the director to watch each shot.   

“I'm looking for every aspect that gives me the ability to make the best visual effects possible,” Spates explained. “So, it could be a range of things: camera movement, camera framing. Is there enough tracking data for us to be able to solve for the camera? Are the blue screens the proper brightness? Is there anything that's going to cause me not to be able to achieve the quality of effect that I'm looking for within the frame?”

“If anything like that happens,” Spates continued, “I communicate with the director, and we figure out together how we can adjust the framing or adjust the visuals to be able to give us the best possible effect.”  

Through this collaborative process, while filming, Spates ensured that the VFX department—and other departments, too—had all the data they needed to create the visual effects in post. 

Capturing critical set geometry for post-production

For Spates, using LiDAR was also about ensuring that all the different departments in production were on the same page and working together.  

“We work on a show that has so many visual effects and such scope, we really try to work with other departments to find out where the divide is. What is the art department going to build, and where do we take over?” Spates said. “In VFX, we don’t want to build everything – we would rather extend from the art, which is going to make the effects look more real.”

And as the effects look more real, Spates’ creative vision truly comes into focus.  

“LiDAR was also used for any extensions we have to do, like extending buildings. We had large-scale environments and our base geometry is derived from our sets,” Spates said.

“Back in 2003 or 2004, before we used LiDAR, we’d have to go in and build geometry to-camera to make sure that any interaction in a frame would work, and it took hours,” he said. “With LiDAR, we capture the data, and we have the data available to us – and as a tracking artist, you’re able to process cameras much faster because you’re using the LiDAR to tell you exactly where the points are for your tracking.”
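The core of what the tracking artist does with those known points can be sketched simply: project the LiDAR 3D points through a candidate camera and compare against the 2D features tracked in the footage; the solver then adjusts the camera until that reprojection error approaches zero. The toy example below (a hypothetical pinhole camera with translation only, assumed values throughout, nothing like a full production solve) shows the error a solver would minimize:

```python
import numpy as np

def project(points_3d, f, t):
    """Pinhole projection: shift into camera space, divide by depth."""
    cam = points_3d - t                  # camera at position t, looking down +Z
    return f * cam[:, :2] / cam[:, 2:3]

# known 3D points from the LiDAR scan (hypothetical coordinates, metres)
lidar_pts = np.array([[0.0, 0.0, 5.0], [1.0, 0.0, 6.0], [0.0, 1.0, 4.0]])
f = 1000.0                               # assumed focal length, in pixels

true_t = np.array([0.2, -0.1, 0.0])      # the "real" camera position
observed = project(lidar_pts, f, true_t) # stands in for tracked 2D features

guess_t = np.zeros(3)                    # solver's initial camera estimate
err = np.linalg.norm(project(lidar_pts, f, guess_t) - observed, axis=1)
print(err.mean())  # mean reprojection error the solver drives toward zero
```

Because the 3D positions come from the scan rather than having to be estimated, the solve has far fewer unknowns, which is why Spates describes tracking as much faster with LiDAR.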

As the LiDAR data captured on set made its way through the VFX pipeline, it provided essential camera tracking information for other artists. In post-production, LiDAR became a vital tool for Stanley Balu, Matchmove Lead at Image Engine.

“There were a couple of sets in the show, especially the village, where what we did is capture the camera motion from on-set, so that the other CG departments could use our CG camera which has been tracked,” Balu said. “LiDAR was very essential for this.” 

“With my work, if I don’t have LiDAR in it, it’s a headache because the camera is going to be here and there, and you cannot build a common geometry,” Balu continued. “But LiDAR gives us that common geometry, and it means that about fifty percent of the work is already done.”

Watch Avatar: The Last Airbender Season 1, streaming now on Netflix.

Want to learn more about our scanners for VFX? Let’s connect.

  • Equipment: Leica RTC360 laser scanner, Leica BLK360 laser scanner
  • Credits:
    • VFX Supervisor: Marion Spates
    • Scanning Technician: Andy Fontana
    • Featured VFX studio: Image Engine   
    • Studio: Netflix 
