How LiDAR Data Supports Workflows for Immersive Live Events

Imagine being at a concert when, during your favorite song, LED lights rain down from the rigging in a perfect geometric display that lines up exactly with the stage below. Or watching one online—say, a DJ Snake concert at Coachella—and seeing incredible AR graphics that burst out from the screen and seem as if they’re part of the festival itself. It’s all part of the show, but how does it work?
Mesh of Berlin Theater

LiDAR (Light Detection and Ranging) technology is increasingly used in live events for its ability to capture detailed spatial information and create highly accurate 3D models of environments: stages, sets, stadiums, and even entire festival grounds like Coachella. In these cases, LiDAR is deployed in the form of laser scanners, which capture a “point cloud”—millions or even billions of 3D data points that represent the real world and form the basis for a 3D model, or “digital twin,” of an environment.
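At its core, that data is just a very large set of XYZ coordinates. As a rough illustration, here is a minimal Python sketch that loads a scan exported to a plain ASCII XYZ file and measures the overall extents of the scanned space; the file name venue_scan.xyz is hypothetical, and real projects typically work with richer formats (such as E57) inside dedicated point cloud software.

```python
import numpy as np

# A point cloud is conceptually just millions of XYZ samples.
# "venue_scan.xyz" is a hypothetical export, one point per line.
points = np.loadtxt("venue_scan.xyz", usecols=(0, 1, 2))  # shape: (N, 3)

# The bounding extents give the footprint and height of the scanned
# space, a first step toward a simple "digital twin."
mins, maxs = points.min(axis=0), points.max(axis=0)
length, width, height = maxs - mins

print(f"Scanned {len(points):,} points")
print(f"Footprint: {length:.2f} m x {width:.2f} m, height: {height:.2f} m")
```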

This helps event designers, producers, and other staff and stakeholders understand the real-world space and everything in it, whether it be a stage or a stadium, and successfully work with it to create powerful live experiences. 

Let’s look at how LiDAR is used in specific ways for live events. 

Venue Mapping and Planning

What if you could plan a space digitally well beforehand, knowing the exact dimensions of what you’ll be working with? Laser scanners, such as the Leica BLK360, create precise 3D models of event venues, both indoor and outdoor, which helps with the layout of stages, seating, lighting, and other infrastructure. Detailed spatial data also allows event planners to optimize the use of available space, ensuring safety and maximizing audience capacity.

LiDAR can even take venue mapping and planning a step further. Planners can run simulations to test different setups and configurations virtually, and LiDAR data and resulting models can be shared with stakeholders for feedback and adjustments. 
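As a simplified illustration of that kind of planning, the sketch below estimates usable floor area and a rough standing capacity from a scan. The file name, the floor tolerance, the 0.25 m grid, and the figure of two guests per square metre are assumptions for illustration only; real limits come from local safety codes, exits, and sightlines.

```python
import numpy as np

# Hypothetical input: an (N, 3) array of scanned points.
points = np.loadtxt("venue_scan.xyz", usecols=(0, 1, 2))
floor_z = points[:, 2].min()

# Keep points within 10 cm of the floor and rasterize them onto a
# 0.25 m grid to approximate the usable floor footprint.
floor_pts = points[np.abs(points[:, 2] - floor_z) < 0.10]
cell = 0.25  # metres per grid cell
occupied = np.unique((floor_pts[:, :2] / cell).astype(int), axis=0)
floor_area = len(occupied) * cell**2

# Illustrative capacity only: assumes ~2 standing guests per square metre.
print(f"Approx. floor area: {floor_area:.0f} m^2")
print(f"Rough standing capacity: {int(floor_area * 2):,}")
```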

Stage Design and Visualization

LiDAR scans and the resulting 3D models allow stage designers to visualize and adjust elements like props, backdrops, and lighting before the actual build. Designers can use LiDAR data to guide the accurate installation of stages, lighting, and other infrastructure. And with highly accurate data at their fingertips, event producers can make on-the-fly adjustments based on real-time LiDAR scans during the setup phase.
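As a hedged example of this kind of pre-build check, the sketch below tests whether a proposed set piece clears the lowest rigging or ceiling point above its footprint. The set-piece dimensions, its position, and the scan file name are hypothetical.

```python
import numpy as np

points = np.loadtxt("venue_scan.xyz", usecols=(0, 1, 2))
floor_z = points[:, 2].min()

# Hypothetical set piece: a 3 m x 2 m backdrop, 6.5 m tall, with its
# corner at (x, y) = (12.0, 4.0) in the scan's coordinate frame.
x0, y0, w, d, piece_height = 12.0, 4.0, 3.0, 2.0, 6.5

# Scanned points (rigging, trusses, ceiling) directly above the footprint,
# ignoring the floor and low clutter below 2 m.
over = points[
    (points[:, 0] >= x0) & (points[:, 0] <= x0 + w)
    & (points[:, 1] >= y0) & (points[:, 1] <= y0 + d)
    & (points[:, 2] > floor_z + 2.0)
]

clearance = over[:, 2].min() - floor_z if len(over) else float("inf")
print(f"Lowest overhead obstruction: {clearance:.2f} m above the floor")
print("Set piece fits" if piece_height < clearance else "Set piece does not fit")
```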

Even more intuitive is how these models can be integrated into augmented reality (AR) applications to provide virtual previews of the stage and venue setup. (More on AR and live events here.) 

Stage visualization in a point cloud of a stadium at Oregon State University

Lighting and Effects

Ever wonder how light shows at concerts and other events can be so immersive? It’s all about understanding the 3D environment. LiDAR data helps lighting designers understand the exact dimensions and layout of a venue, enabling precise placement and programming of lights. 
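For example, once a fixture position and a focus point are both known in the scan’s coordinate frame, the throw distance, pan and tilt angles, and beam coverage fall straight out of the geometry. The coordinates and the 20-degree beam angle below are made-up values for illustration.

```python
import numpy as np

# Hypothetical coordinates taken from the scan: a fixture hung on a truss
# and a focus point at centre stage, both in the same coordinate frame.
fixture = np.array([5.0, 2.0, 9.0])   # x, y, z in metres
target = np.array([8.0, 6.0, 1.5])

v = target - fixture
throw = np.linalg.norm(v)                      # beam throw distance
pan = np.degrees(np.arctan2(v[1], v[0]))       # rotation about the vertical axis
tilt = np.degrees(np.arctan2(-v[2], np.hypot(v[0], v[1])))  # downward tilt

beam_angle = 20.0  # degrees; an assumed fixture beam angle
spot = 2 * throw * np.tan(np.radians(beam_angle / 2))

print(f"Throw: {throw:.2f} m, pan: {pan:.1f} deg, tilt: {tilt:.1f} deg")
print(f"Beam diameter at the focus point: {spot:.2f} m")
```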

The same understanding of a venue also extends to projection mapping, where visuals are projected onto surfaces in a way that aligns perfectly with the physical space. 3D LiDAR data is quickly becoming essential for this workflow. 
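One common building block, sketched below under assumed coordinates, is fitting a plane to the scanned points of a projection surface so content can be warped to match it. The bounding box used to crop the wall and the file name are hypothetical.

```python
import numpy as np

points = np.loadtxt("venue_scan.xyz", usecols=(0, 1, 2))

# Hypothetical crop: points belonging to the projection surface,
# selected by a simple bounding box in the scan's coordinate frame.
wall = points[
    (points[:, 0] > 20.0) & (points[:, 0] < 30.0)
    & (points[:, 1] > 14.8) & (points[:, 1] < 15.2)
    & (points[:, 2] > 0.0)
]

# Least-squares plane fit: the normal is the right singular vector
# with the smallest singular value of the centred point set.
centroid = wall.mean(axis=0)
_, _, vt = np.linalg.svd(wall - centroid, full_matrices=False)
normal = vt[-1]

# The centroid and normal define the plane that projector content
# must be aligned to so visuals match the physical surface.
print(f"Surface centroid: {centroid}")
print(f"Surface normal:   {normal}")
```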

Point cloud with rigging anchor markup overlaid in AutoCAD

Broadcast, Camera Placement, and Virtual Sets

For many live events, such as sports games, the broadcast is where most viewers will see the show. LiDAR scans assist in determining optimal camera positions for live broadcasts, ensuring the best angles and coverage.
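A simple way to sanity-check a proposed camera position against a scan is a line-of-sight test: sample the sightline from camera to subject and flag any scanned geometry near it. The sketch below assumes SciPy is available and uses made-up camera and target coordinates.

```python
import numpy as np
from scipy.spatial import cKDTree

points = np.loadtxt("venue_scan.xyz", usecols=(0, 1, 2))
tree = cKDTree(points)

# Hypothetical camera position and on-stage target, in the scan's frame.
camera = np.array([0.0, 25.0, 4.0])
target = np.array([10.0, 5.0, 1.5])

# Sample the sightline every 10 cm, stopping short of the target so the
# stage itself is not flagged, and check for scanned geometry (columns,
# speaker hangs, video walls) within 20 cm of the line.
steps = int(np.linalg.norm(target - camera) / 0.10)
samples = camera + np.linspace(0.0, 0.95, steps)[:, None] * (target - camera)
distances, _ = tree.query(samples)

blocked = bool((distances < 0.20).any())
print("Sightline blocked" if blocked else "Clear line of sight")
```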

Filming benefits in a similar way. In combination with green screens, LiDAR data can be used to build virtual sets that blend seamlessly with live and filmed footage.

LiDAR: Your Single Source of Truth

There are other ways to accomplish all of the above, but only LiDAR data provides a single source of truth for 3D modeling and understanding of any live event space, whether it’s a small club, a public installation, or a huge stadium. Data from Leica Geosystems laser scanners can also be used in Leica CloudWorx, which integrates with design tools common in media and entertainment, like Vectorworks and AutoCAD.

For anyone working in event production that requires measurements of a space, laser scanning provides highly accurate context in which to place their work—whether it be sets, stages, AR graphics, lighting, or cameras—and to create the best possible effects and overall presentation.

Want to learn more about our scanners for live events? Let’s connect.

Mesh from point cloud of stage
