Meet virtual production, the workflow that’s transforming content creation

Virtual production describes workflows that use game engines, soundstages, and LED screens in place of traditional sets or locations. Originally the domain of the media and entertainment industry, virtual production is now used across many industries.


Virtual production techniques turn soundstages into immersive environments, like the futuristic shuttle bay from season 4 of Star Trek: Discovery. Image courtesy of Pixomondo.

Drew Turney

October 29, 2024

  • For the media and entertainment industry, advances in data and hardware have been converging for the past few years, and productions are finding they add up to more than the sum of their parts.

  • Enter virtual production—a process that uses game engines, soundstages, and LED screens in place of traditional sets or locations.

  • This process saves time and production costs, enables remote collaboration, and allows earlier access to marketing assets like trailers for highly anticipated content.

  • As costs decrease, applications of virtual production are also popping up in fields like architecture, construction, health care, and sales for virtual tours, realistic and safe training, and other uses.

There’s a whole new way to make media and entertainment (M&E) content using game engines, soundstages, LED screens, and digital data. Welcome to the age of virtual production.

Avatar (2009) was the most visible early example; then, virtual production practices slowly became more common until the lockdowns of the COVID-19 pandemic made them essential. Today, productions can give actors costumes and props and then do everything else digitally, with environments and backgrounds built in software and displayed on giant LED screens in enclosed, weatherproof performance spaces.

This represents the confluence of several established tools and techniques—and artists and studios are getting on board as costs come down and the results keep improving.

A long time coming

Virtual production can be used in everything from previsualization, where a sequence is animated at low resolution to see if it works, right up to final render, where the high-definition backdrop is projected for in-camera capture.

Many elements of virtual production were established long before COVID-19, but restrictions around travel during the height of the pandemic gave the process its Hollywood moment, as the advances in technology were in perfect sync with the new normal of social distancing.

For example, when shooting Marvel’s Hawkeye, stars Jeremy Renner, Hailee Steinfeld, and Vera Farmiga acted out their parts remotely on soundstages, with software capturing their performances and importing the data into a single virtual scene.

It wasn’t ideal, but it was an effective proof of concept. “You’re definitely seeing a lot of remote collaboration,” says Chris Bannister, executive producer of virtual production at Industrial Light & Magic (ILM).

Wearing VR headsets and using game controllers, director Jon Favreau, crew members, and MPC artists collaborate on visuals. Image courtesy of MPC.

Moving Picture Company (MPC) Production VFX Supervisor Matt Jacobs has also seen virtual production take off due to the focus that pandemic filming practices put on LED light stages—which also helped improve tools that already existed. “Virtual production tools and techniques were used on movies like Jon Favreau’s The Lion King well before the pandemic,” Jacobs says. “Things like virtual cameras, pre- and post-vis, had been around for a very long time.”

Game engines like Unreal Engine and Unity, a cornerstone of the workflow, have long contained virtual production tools. Directors in film and gaming can control virtual cameras with gaming joysticks, and game engines can encode the movement of cranes and jib arms to follow on-set camera movement, reproducing it in the digital environment.
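
To make that hand-off concrete, here is a minimal Python sketch of the idea: one frame of encoder data from a tracked crane driving a matching virtual camera. The field names and the apply_tracking_sample helper are invented for illustration, not any engine’s real API.

```python
from dataclasses import dataclass

@dataclass
class CameraPose:
    """Position (meters) and orientation (degrees) of a camera in stage space."""
    x: float
    y: float
    z: float
    pan: float
    tilt: float
    roll: float

def apply_tracking_sample(sample: dict, stage_offset: tuple) -> CameraPose:
    """Map one frame of crane/jib encoder data onto the virtual camera.

    The virtual camera mirrors the physical one, shifted so the stage
    origin lines up with the origin of the digital environment.
    (Hypothetical field names; real tracking protocols differ.)
    """
    ox, oy, oz = stage_offset
    return CameraPose(
        x=sample["x"] + ox,
        y=sample["y"] + oy,
        z=sample["z"] + oz,
        pan=sample["pan"],
        tilt=sample["tilt"],
        roll=sample["roll"],
    )

# One frame of encoder data from a tracked jib arm:
frame = {"x": 1.2, "y": 0.4, "z": 1.8, "pan": 15.0, "tilt": -3.0, "roll": 0.0}
print(apply_tracking_sample(frame, stage_offset=(0.0, 10.0, 0.0)))
```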

Alejandro Arango, director of virtual production at Epic Games, the company behind Unreal Engine, says studios and filmmakers have been visualizing and capturing performances directly into Unreal Engine for years, streaming the data directly from mocap systems or tools like Autodesk MotionBuilder.

The integration of virtual production tools such as production databases, publishing workflows, and render farms with Unreal had been rapidly evolving prior to the pandemic. “And, of course, in-camera VFX [ICVFX] was all proven viable with The Mandalorian,” Arango says.

Finally in the spotlight

When the components of virtual production emerged back in the ’90s, they were prohibitively expensive, and CGI was highly manual, very blocky, and low quality. For many, it just wasn’t worth the trouble. But engineers, software developers, and animators continued to push the limits, making groundbreaking visuals more affordable.

For 2014’s Interstellar, director Christopher Nolan had LED screens set up outside the windows of various spaceship sets, letting the actors react to animations of space travel, wormholes, and the black hole Gargantua firsthand while filming.

Now, directors can also see live footage of animated backgrounds or CGI characters on set, and the cinematographer or visual-effects artists can remotely make tweaks to lighting, colors, and other variables. And it can all be done in real time as the scene is being shot.

Pieces of the puzzle

A true virtual production pipeline comprises several key components:

Pixomondo’s virtual production stage in Toronto is dominated by a curved wall of LED screens. Image courtesy of Pixomondo.

1. The stage

Since the advent of CGI, VFX artists have animated backdrops, effects like smoke or water, characters, props, and everything in between. Actors performed against huge green screens that compositing software later isolated and removed, with artists digitally painting in the final visuals during postproduction.

With virtual production, LED projection puts the digital painting on set, to be seen and reacted to in real time. Digital elements are predesigned, rendered, and projected behind the performers so everyone can see the final on-screen render during filming.

The image projection can be on a single LED screen (essentially a big-screen TV) or an entire wall of LED screens. The world’s biggest such wall was completed in 2023 at Australia’s Docklands Studios, Melbourne, by virtual production provider NantStudios. The curved wall is 88 m (289 ft) long, over 12 m (40 ft) high, and comprises 6,000 LED panels.
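
A quick back-of-the-envelope check of those figures, assuming a fully tiled wall of square panels, shows the implied panel size is roughly that of common LED wall hardware:

```python
# Rough panel-size check for the NantStudios wall described above:
length_m, height_m, panels = 88.0, 12.0, 6000
area_per_panel = (length_m * height_m) / panels   # ~0.18 m^2 per panel
panel_edge = area_per_panel ** 0.5                # ~0.42 m on a side
print(f"Each panel covers about {area_per_panel:.2f} m^2 (~{panel_edge * 100:.0f} cm square)")
```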

In a virtual production setup, the distorted perspective of the backdrop is corrected automatically to match the position of the virtual camera. Image courtesy of Unreal/Epic.

2. The camera

Any digital or film camera can be recruited into virtual production—the secret sauce is behind the scenes. If you’ve ever seen a publicity still of a virtual production backdrop, you might notice that the perspective of the image is outlandishly wrong. That’s because software syncs the camera’s position and viewpoint with the image being projected. In the early days, the processing had to be done manually, which severely constrained the movement of the camera.

But today, the real-time tracking of the camera and lens is built in. “The data is enough to render the view with the correct perspective,” Arango says. “It’s then sampled using a UV map generated by the projection of a virtual model of the display geometry onto the plane of the camera so the render appears correctly.”
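
As a rough illustration of the principle Arango describes, the sketch below uses a toy pinhole-camera model (no rotation, lens distortion, or curved-wall geometry, all of which real systems handle) to work out which rendered pixel belongs at a given physical point on the wall:

```python
def wall_point_to_camera_uv(wall_point, cam_pos, focal_len):
    """Project a physical point on the LED wall into the camera's image plane.

    Toy pinhole model: the camera looks straight down +Z with focal
    length focal_len. The (u, v) result says which pixel of the render
    should appear at this wall position so the backdrop reads correctly
    from the camera's point of view.
    """
    rx = wall_point[0] - cam_pos[0]
    ry = wall_point[1] - cam_pos[1]
    rz = wall_point[2] - cam_pos[2]   # depth along the camera's view axis
    return focal_len * rx / rz, focal_len * ry / rz   # perspective divide

# A wall point 1 m right and 2 m up, 4 m in front of the camera:
print(wall_point_to_camera_uv((1.0, 2.0, 4.0), (0.0, 0.0, 0.0), focal_len=35.0))
```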

That means preplanning is crucial. James Thompson, virtual production and capture supervisor at VFX and virtual production studio Pixomondo, cautions that the director and crew have to make sure the wall coverage supports the desired shot angles and provides adequate lighting.

In fact, virtual production takes the way the eye perceives light into account to make a CGI backdrop look even more realistic. “The computer hardware renders an inner and outer frustum [aka a truncated pyramid shape],” Thompson says. “The inner frustum covers the main or ‘safe’ frame of the camera and has the highest detail and fidelity. The outer frustum covers the periphery and it’s typically rendered at a lower resolution with reduced detail. That maintains continuity in the scenery so as the camera moves, the environment appears continuous and immersive.”
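
In code, Thompson’s inner/outer split reduces to testing whether a point on the volume falls inside the camera’s view cone. This hypothetical sketch uses an arbitrary 20-degree half-FOV purely for illustration:

```python
def render_quality(angle_from_lens_axis: float, inner_half_fov: float = 20.0) -> str:
    """Pick a render tier for a point on the LED volume.

    Points inside the camera's "safe" field of view get the high-detail
    inner-frustum render; the periphery gets a cheaper outer-frustum pass.
    (The 20-degree default is an arbitrary illustration, not a real setting.)
    """
    if angle_from_lens_axis <= inner_half_fov:
        return "inner frustum: full resolution"
    return "outer frustum: reduced resolution"

for angle in (5, 19, 45, 120):
    print(f"{angle:>3} degrees off-axis -> {render_quality(angle)}")
```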

Game engine software lets filmmakers use VR to virtually scout an environment. Image courtesy of Unreal/Epic.

3. The software

VFX artists create CGI using powerful 3D modeling, animation, and effects software—for example, Pixomondo employs Autodesk Maya, while ILM uses both Autodesk Maya and Autodesk 3ds Max. The assets built using these tools can be exported to a game engine like Unreal Engine, the most common software tool for generating the real-time imagery used in virtual production.

Just as game developers use the built-in tools of game engines to design landscapes, in-game physical laws, interactive elements like vehicles or props, and much more, film directors and VFX artists can use those same tools to build CGI worlds. The only difference is that instead of a player using a controller to decide where a character goes and what they do, the director chooses the action, camera angles, and conditions to effectively tell the story.
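
As a loose illustration of that analogy, here is a hypothetical sketch of joystick input steering a virtual camera the way a game’s player controller steers a character; all values are invented:

```python
import math

def steer_virtual_camera(pose, stick_x, stick_y, speed=2.0, dt=1 / 24):
    """Advance a virtual camera by one frame of joystick input.

    pose is (x, y, heading_radians): stick_x turns the camera, stick_y
    moves it along its heading. The same loop a game uses for a player
    character, repurposed to frame a shot. Values are illustrative.
    """
    x, y, heading = pose
    heading += stick_x * dt                        # turn
    x += math.cos(heading) * stick_y * speed * dt  # move along heading
    y += math.sin(heading) * stick_y * speed * dt
    return (x, y, heading)

pose = (0.0, 0.0, 0.0)
for _ in range(24):  # one second at 24 fps, pushing forward and turning right
    pose = steer_virtual_camera(pose, stick_x=0.5, stick_y=1.0)
print(pose)
```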

The benefits of virtual production

With virtual production, directors can place their actors anywhere in the world—or, as in this scene from season 1 of Star Trek: Strange New Worlds, anywhere in the galaxy. Image courtesy of Pixomondo.

Although its action happens on multiple planets and in space, The Mandalorian was filmed entirely on LA soundstages and a small backlot. By contrast, the original Star Wars trilogy was filmed on locations from the US and the UK to Tunisia, Norway, and beyond.

This shows the main benefits of virtual production: speed and efficiency. Thompson says the real-time feedback of virtual production lets directors, cinematographers, and VFX artists instantly assess the effectiveness of shots. “It helps identify issues early on, potentially averting costly post-production fixes,” he says.

Bannister adds that the technology offers novel creative possibilities. “We’ve used it in smaller broadcast shows to solve immediate problems,” he says. “There was a scene in How I Met Your Father where a story point takes the characters to the Brooklyn Bridge, but it was going to be too costly and take too long to go there, so we came up with a way to show the Brooklyn Bridge very quickly on an LED stage.”

Another benefit is sustainability—Thompson says virtual production cuts the energy consumed by transportation, lighting, and the heating and cooling of large production spaces. “LED lighting and virtual sets that can be adjusted in real time let you be very energy efficient.”

And if you’re dealing with a highly anticipated series or movie, this gives earlier access to marketing assets. Trailer-worthy moments normally wouldn’t be ready until after final postproduction VFX, but capturing high-definition footage complete with in-camera CGI gives you hero shots immediately.

Beyond entertainment

The market for virtual production techniques is broad—and growing. According to a 360i Research forecast, the market size was estimated at $2.55 billion in 2023 and is expected to reach $8.42 billion by 2030.
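
Taking the forecast’s own endpoints, that works out to an implied compound annual growth rate of about 18.6 percent:

```python
# Implied growth rate of the 360i Research forecast ($2.55B in 2023 to $8.42B in 2030):
start, end, years = 2.55, 8.42, 2030 - 2023
cagr = (end / start) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.1%}")  # about 18.6%
```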

In addition to scripted entertainment, virtual production has a solid position in fixed-camera environments like news and TV sports broadcasting—and it’s applicable to corporate training and informational videos shot in small studios.

Forward-thinking users in other industries are also putting virtual production into practice. Architects and construction firms are using digital 3D designs of projects to let stakeholders “walk” around inside a representation of the building—either on-site or remotely using virtual reality (VR) headsets when the visual data can be condensed and streamed.

Performance capture technology, a common component of virtual production, is increasingly being used outside of traditional filmmaking. Image courtesy of MPC.

Thompson worked with Sony on a fan experience built with virtual production at the 2024 CES trade show, and the response showed that immersive narrative experiences could represent an exciting market in interactive shows and entertainment parks. “Thousands of people flocked to the event, interacting in real time with characters like the Ghostbusters’ Stay Puft Marshmallow Man. The experience used capture technology and haptic flooring, and the environment and characters were taken from our real-time VFX Ghostbusters short film project in the Unreal Engine.”

Other industries using virtual production include automotive sales in virtual showrooms, and education in fields like health care and the military, where training environments often need to be as safe as they are realistic.

This wider use is partly because the cost proposition has flipped. The systems and technology behind virtual production aren’t cheap, but as camera technology, data processing, and display equipment come down in price, they become more accessible, opening up creative possibilities.

“Smart people have engineered better solutions to creative problems,” says Jacobs. “We used to shoot on film and now most cameras are digital. Computers, software, and rendering are exponentially faster. Data can be transferred much quicker all over the globe. Above all, people used to be much more specialized—now there are countless artists and technicians in this field who have multiple crossover disciplines. It creates competition and has a huge impact on cost.”

The latest thing

Hollywood can follow trends, but virtual production is more than a passing fad: it has become an essential tool in the filmmaker’s arsenal, joining groundbreaking advances like sound, color, and CGI.

Like those other tools, virtual production will find its place, suiting some projects better than others. Understanding a production’s needs will show whether a project is likely to benefit from these tools and workflows.

It’s unlikely that virtual production will completely replace location shoots any time soon. After The Mandalorian, the Star Wars series Andor was shot almost entirely on location, with only some scenes using StageCraft, ILM’s virtual production toolset, an aesthetic decision made by the showrunners and directors. As Jacobs says, “Virtual production has a place for certain environments and it can be useful on very technical shots, but not all locations are appropriate and not all productions can afford it.”

But it’s clear that virtual production is here to stay, and can bring new levels of cost savings, efficiency, and creative opportunities to many projects across industries.


About Drew Turney

After growing up knowing he wanted to change the world, Drew Turney realized it was easier to write about other people changing it instead. He writes about technology, cinema, science, books, and more.
