Lux Machina and the Leica RTC360 Provide 3D Data for Flying Augmented Reality Animations at Coachella 

Coachella. The massive, famous American desert music festival, with 250,000 attendees in 2024, is one of the largest music festivals in the world. Stages, concertgoers, artists and their entourages, vehicles, tents, crew, and so much more fill the lively festival grounds for three days of showstopping performances.
Leica RTC360 at Coachella

Augmented reality (AR) experiences – digital 3D graphics that weave their way around big stages and above huge crowds, mediated by screens and smartphones – are often part of those performances. But keeping half a million eyeballs synced to the same experience is no mean feat. The complex logistics of so many moving parts requires the best technology for the job.

For Lux Machina, a developer of leading-edge technical video solutions for film, TV, and live events, that perfect technology was a Leica RTC360 laser scanner, which captured the entire festival grounds of Coachella. And while the final product of AR graphics floating and flying over audiences during a concert is truly stunning, it couldn’t happen without Lux Machina’s application of laser scanning technology.

The RTC360, Lux Machina, and a billion 3D data points

Lux Machina had a complicated task. To make sure that AR experiences during concerts by Grimes, Anyma, DJ Snake, and other artists went off flawlessly, everyone involved needed a precise, comprehensive 3D “digital twin” of the festival space. That’s where laser scanning with the RTC360 became essential.

Scans captured by the RTC360 contain vital 3D information that supported AR workflows for five artists who performed on the Main Stage and in the Sahara Tent during the live stream broadcast of Coachella. Grimes and Anyma had AR elements that incorporated character animations floating above both artists’ performances simultaneously, while 3D graphics flew out towards the audience from the screen for DJ Snake.

Leica RTC360 at Coachella

To have these AR experiences look and feel as real as possible, like they are part of the real-world experience of Coachella, designers and engineers need to work with a digital twin of the real-world location. The digital twin is built from 3D laser scans, which offer several advantages over other measurement methods for capturing a large space like the Coachella stages and festival grounds in 3D, according to Wyatt Bartel, President of Lux Machina.

“I could have used a regular laser distance meter and just said where these particular 10 points are and then referenced and adjusted those, rather than the billion points of the stage and grounds itself captured by the RTC360,” Bartel said. “But the stages themselves are crucial for our AR object occlusions. The density of the captured data not only directly influenced the alignment of AR graphics with the cameras used by the team, but also helped embed the graphical artist elements into the environment.”

With Bartel leading the scanning mission, the project involved mapping a comprehensive three-dimensional digital twin of the festival grounds. To achieve this, the team performed 125 scans that provided the essential data for creating an accurate construct of the entire expanse of the polo field where the festival was staged, using Leica Cyclone FIELD 360 software to pre-align these scans before processing them back in the office.
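Dedicated software such as Cyclone FIELD 360 handles pre-alignment automatically, but the underlying idea is simple: each scan is captured in the scanner's own local coordinate frame, and a rigid transform (a rotation plus a translation) estimated per scan brings them all into one shared site frame. A minimal NumPy sketch of that step, using entirely hypothetical scan data:

```python
import numpy as np

def rigid_transform(points, rotation_deg, translation):
    """Apply a rigid (rotation + translation) transform to an N x 3 point array.

    Pre-aligning scans amounts to estimating one such transform per scan so
    that every point cloud ends up in a single shared site coordinate frame.
    """
    theta = np.radians(rotation_deg)
    # Rotation about the vertical (z) axis, typical for tripod-mounted scans
    rz = np.array([
        [np.cos(theta), -np.sin(theta), 0.0],
        [np.sin(theta),  np.cos(theta), 0.0],
        [0.0,            0.0,           1.0],
    ])
    return points @ rz.T + np.asarray(translation)

# Hypothetical example: merge two tiny "scans" into one site frame
scan_a = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])  # already in site frame
scan_b = np.array([[0.0, 0.0, 0.0], [0.0, 1.0, 0.0]])  # scanner-local frame
scan_b_aligned = rigid_transform(scan_b, rotation_deg=90.0,
                                 translation=[5.0, 0.0, 0.0])
site_cloud = np.vstack([scan_a, scan_b_aligned])  # one merged point cloud
```

With 125 real scans the same merge is repeated for each one, after which the combined cloud can be treated as a single dataset.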

Laser scanning captures data that is known as a “point cloud”, or a collection of 3D data points that are an exact digital representation of the physical objects, structures, and terrain scanned. The point clouds captured by Lux Machina replicated the enormity of the festival landscape, and the RTC360 specifically provided benefits for capturing Coachella.
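Conceptually, a point cloud is nothing more than a large array of (x, y, z) samples, which is why even billion-point captures can be handled with ordinary array math. A small illustrative sketch in NumPy, using made-up numbers rather than real scan data:

```python
import numpy as np

# A point cloud is simply an N x 3 array of (x, y, z) samples in metres;
# a billion-point site capture is the same structure at a much larger N.
# The values below are random placeholders, not real Coachella data.
rng = np.random.default_rng(seed=0)
point_cloud = rng.uniform(low=[0, 0, 0], high=[400, 250, 30], size=(10_000, 3))

# The axis-aligned bounding box gives the captured extent along each axis
extent = point_cloud.max(axis=0) - point_cloud.min(axis=0)

# Ground-plane point density: points per square metre of footprint
density = len(point_cloud) / (extent[0] * extent[1])
```

The same extent and density computations scale directly to full-site clouds; only N changes.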

The precision offered by the scanner ensured there were no misalignments or inaccuracies that could disrupt the AR graphics.

Alignment of Leica RTC360 data in Leica Cyclone FIELD 360

“The alignment is everything,” Bartel explained. “It unlocks the ability to superimpose the virtual elements onto the captured reality correctly, and is one of the foundational elements in the process of creating the AR experience.”

Bartel also noted that the RTC360’s long range and high precision were crucial in a large space. The digital twin of Coachella was captured in stages, focusing initially on the Main Stage before moving around to different locations. The primary purpose was to achieve near-360-degree coverage of each setup location and the adjacent areas, providing as many data points as possible for the next step – the AR workflow.

Alignment, accuracy, and object occlusion for realistic AR elements

To enhance the realism of the AR projections during performances, Lux Machina also paid attention to “object occlusions”: the goal was to understand the entire 3D space to ensure realistic occlusions, where one object should (or shouldn’t) visually obscure another.

By understanding the detailed layout of the festival grounds with 3D data, the team could simulate a believable interaction between the AR objects and the environment, including the stages, tents, and everything else captured by the scanner.

“We needed to understand all of the 3D space between our cameras and one mile away to make sure that the palm trees are obscuring the object [and] that the other stages far in the distance are obscuring [it],” Bartel said. This results in a much more realistic presentation of AR elements, able to “move” in front of and behind such objects, like Ferris wheels and stages.
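One common way a compositor realises such occlusions is a per-pixel depth test: the scanned digital twin is rendered from the broadcast camera's pose to get the depth of the real environment, and an AR pixel is drawn only where the graphic is nearer than that geometry. A simplified sketch of the idea (not Lux Machina's actual pipeline, which runs inside Unreal Engine):

```python
import numpy as np

def occlusion_mask(scene_depth, ar_depth):
    """Per-pixel visibility for an AR layer over a camera feed.

    scene_depth: distance to the real environment at each pixel, rendered
                 from the camera's pose using the scanned digital twin
                 (metres; inf where there is open sky).
    ar_depth:    distance to the AR graphic at each pixel (inf = no graphic).

    An AR pixel is visible only where the graphic is nearer than the real
    geometry, so stages and palm trees correctly hide graphics behind them.
    """
    return ar_depth < scene_depth

# Hypothetical 1 x 4 pixel strip: a palm tree at 30 m, open sky elsewhere
scene_depth = np.array([[np.inf, 30.0, 30.0, np.inf]])
ar_depth = np.array([[50.0, 50.0, 20.0, np.inf]])  # graphic at 50 m, then 20 m
visible = occlusion_mask(scene_depth, ar_depth)
# Visible against open sky, hidden behind the tree at 50 m, in front at 20 m
```

The density of the scan matters here: the finer the captured geometry, the more precisely the rendered scene depth matches the real camera feed at every pixel.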

Leica RTC360 data in Leica Cyclone FIELD 360

However, the significance of accuracy extends further.

“We need to be able to have a dancer’s foot hit a mark where it needs to be. Otherwise, they’re going to collide with real objects,” he said. Any misalignment or inaccuracies could drastically impact the visual output, undermining the immersive experience for the audience by potentially causing distortions or accidental collisions within the AR graphics.

Minute but critical details like occlusions contribute to the authenticity of the AR graphics, enhancing the audience’s perception of their depth and positioning within the real-world environment.

Laser scans become a 3D mesh in Unreal Engine

The collected data from the laser scans were processed and converted into a 3D mesh using Unreal Engine. A mesh is a “complete” version of a point cloud: the tiny gaps between data points are connected and filled in to create a smooth 3D model. In this case, the 3D mesh covers the entire Coachella festival grounds, and it is the digital twin required for the AR workflows.

With the 3D mesh ready to go, Lux Machina’s sister company, Halon Entertainment, augmented and optimized 3D animations and graphics from the musical artists’ teams and created custom AR based around the accurate model of the Coachella festival grounds. As Halon producer Andrew Ritter put it, “the digital twin eliminates the guesswork and lets us visualize and plan throughout the creative process.”

Bartel said that “the 3D mesh is masked, and the content is composited into our PTZ and broadcast camera feeds and tracked using the UE compositor plug-in, Stage Precision.” This software was critical for live camera tracking, lining up the 3D mesh with the actual stage or venue shown in the camera feeds.

The 3D mesh not only represented the physical layout of the festival, but also allowed for precise overlay of AR graphics, enhancing the audience experience through a seamless blend of virtual elements within the physical festival space.
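Production meshing tools work on unstructured clouds of billions of points; the core idea of turning discrete samples into a continuous surface can be sketched on the much simpler case of a regular grid of height samples. A hypothetical illustration in NumPy, not the actual Unreal Engine workflow:

```python
import numpy as np

def grid_to_mesh(heights):
    """Triangulate a regular grid of height samples into a mesh.

    Simplified sketch: assumes the points already lie on a rows x cols grid
    and shows the core meshing idea, joining neighbouring samples into
    triangles so gaps between discrete points become a continuous surface.
    """
    rows, cols = heights.shape

    # One vertex per grid sample: (x, y, z)
    xs, ys = np.meshgrid(np.arange(cols), np.arange(rows))
    vertices = np.column_stack([xs.ravel(), ys.ravel(), heights.ravel()])

    # Two triangles per grid cell, each a triple of vertex indices
    triangles = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c
            triangles.append((i, i + 1, i + cols))             # upper-left
            triangles.append((i + 1, i + cols + 1, i + cols))  # lower-right
    return vertices, np.array(triangles)

verts, tris = grid_to_mesh(np.zeros((3, 3)))  # a flat 3 x 3 patch
# 9 vertices; 4 grid cells x 2 triangles = 8 triangles
```

Real point clouds are unstructured, so production tools use more sophisticated surface-reconstruction algorithms, but the output is the same kind of vertex-and-triangle structure a game engine can render and mask.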

The RTC360: “It just works.”

The RTC360 combines ease of use with high precision, facilitating the capture of scores of detailed point clouds while avoiding the guesswork associated with other methods.

“[With other methods,] instead of moving on to the artistic elements in Unreal Engine, you’re going to spend the rest of your day sort of just blocking out big cubes in your data trying to approximate what the real world is,” Bartel said, pointing out the efficiency of the RTC360 and the amount of precise data it captures.

“I think hitting start on a LiDAR scanner and coming back five minutes later to a complete point cloud — it just works. There’s no guessing, and that’s the whole point.”
