Virtual Production (deep dive) — The technology behind Avatar and The Mandalorian is the key to the future of filmmaking and content creation.

Film is a medium that has always been a partnership between technology and storytelling. From its earliest days, the foundation was a technical breakthrough: the moving-image film camera. Stories began to follow and explore what that camera could do. Throughout the century-long legacy of cinema, there has been a dance between innovation and filmmakers being inspired by those innovations, opening up the stories they could tell as a result, then coming up against limitations and innovating their way out of those limitations with new tools.

The legacy of cinema is relatively short, so many of the people behind those technical breakthroughs are still around and available to talk to.

— Jon Favreau

(filmmaker and creator of pioneering virtual production projects such as “The Jungle Book” (2016), “The Lion King” (2019), and the Star Wars TV series “The Mandalorian”.)

What separates Virtual Production from CGI?

Virtual production is the combination of physical and digital production: using physical filmmaking equipment, actors, sets, even locations, with digital augmentation at the same time.

In the early 1990s, CGI started to become popular. Movies like Terminator 2, Jurassic Park, and later The Matrix were made possible by it, but rendering times were far too long for filmmakers to work efficiently, and you had to wait until post-production to get an idea of what those computer-generated visuals would look like.

That’s where gaming technology comes in. Even though gaming technology and filmmaking share many technical and creative inspirations, they have mostly stayed on separate paths.

In virtual production, you don’t separate the core creativity of filmmaking for months (the gap between shooting and post-production), because before you start shooting with actors, you build the world you’re going to shoot in.

Building that world comes under the purview of the Art Department.

Art Departments in Virtual Production

  1. The physical art department
  2. The digital art department
  3. The virtual art department

The virtual art department unites the two, sitting on a spectrum between the digital and physical art departments, by creating real-time digital assets that can be used in virtual production.

All of these art departments operate separately but work under the unified vision of the Production Designer.

The virtual art department builds both physical and digital assets, which assist the director in blocking scenes and the cinematographer in framing shots during virtual scouts.

This virtual scouting process is called “techvis”: you wear VR goggles and tour the digital world to identify the parts of a scene that need to be built by the physical art department, and the parts that will be accomplished either in live action (on set with the actors) or by visual effects artists in post-production.

Virtual Scout — Techvis technology

Virtual Production Fundamentals

  • Virtual cinematography
  • Performance capture
  • Traditional filmmaking interface

You can use traditional live-action cinematography equipment as an interface by building the assets with real-time game-engine technology. For that, a previsualization process is employed.

Pre-visualization: in virtual production pre-vis, you build a visual digital prototype of the set that creative team members can access via VR goggles.

(Storyboarding is currently the most common pre-visualization technique, but storyboards are sketches or animated graphics that reveal only a shallow amount of detail compared to virtual production, which lets you see the world in 3D with 360° access to the set.)

Pre-visualization via storyboarding
Pre-visualization in Virtual Production

Your pre-vis team can employ virtual production techniques to build the 3D visual world (accessible via VR goggles) and bring a tactile, hands-on experience to the process. It makes collaboration easy and creative, as the crew is able to understand the scene visually in great detail before filming starts.

Virtual production is still in its early days, so most of the crew is new to the technology and comes from a physical-production background. The previsualization process helps them understand virtual production better, because you can scout and block scenes just as you would in physical production.

Virtual Production Techniques on a Live-Action Set

Real-time camera tracking and compositing enable you to see what your shot will look like while the shoot is going on, instead of months later.

  1. Real-time assets (reference quality)
  2. Real-time camera tracking
  3. Real-time compositing (reference quality)

In this approach, you’ll use a blue or green screen for the parts of the frame that you’ll replace with digital visuals.

You record the live footage through the camera while one of many camera-tracking systems reports where the camera is. Then you render a matching view out of the real-time engine and use a real-time compositor to replace the blue or green screen.

This technique is also referred to as simulcam, because of the camera’s ability to work in both the physical set and the virtual set. As a filmmaker, this allows you to see both composited together.

Simulcam
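To make the compositing step concrete, here is a minimal sketch of a chroma key in plain Python. This is illustrative only: the tiny frame, key color, and tolerance are assumptions, and real simulcam compositors work on live video with despill and soft edges rather than a hard per-pixel test.

```python
import math

KEY_COLOR = (0, 255, 0)   # pure green screen (assumed key color)
TOLERANCE = 80            # how close a pixel must be to the key to be replaced

def chroma_key_composite(foreground, background):
    """Composite two frames (lists of rows of (r, g, b) tuples).

    Wherever a foreground pixel is within TOLERANCE of the key color,
    the engine-rendered background pixel shows through instead.
    """
    out = []
    for fg_row, bg_row in zip(foreground, background):
        row = []
        for fg_px, bg_px in zip(fg_row, bg_row):
            dist = math.dist(fg_px, KEY_COLOR)  # color distance from the key
            row.append(bg_px if dist < TOLERANCE else fg_px)
        out.append(row)
    return out

# Tiny 1x3 "frame": green pixel, near-green pixel, red pixel.
fg = [[(0, 255, 0), (10, 240, 10), (200, 50, 50)]]
bg = [[(128, 128, 128)] * 3]   # flat gray stand-in for the virtual set
print(chroma_key_composite(fg, bg)[0])
# -> [(128, 128, 128), (128, 128, 128), (200, 50, 50)]
```

The two greenish pixels are replaced by the rendered background; the red “actor” pixel survives. A production keyer does this per pixel, per frame, in real time on the GPU.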

You can also use real-time camera tracking and compositing to add a computer-generated foreground to a live-action plate (a plate is footage shot on set, here against a green or blue screen background).

But if you want to get fancy like the crew on Avatar, which started the process around 2005: they performance-captured a character live (to direct an actor on or off camera), streamed that performance onto a digital character, and composited that digital character together with a live-action plate at the same time.

Virtual Production Stage (LED wall / Volume technology)

  1. LED screens/projection for In-camera finals
  2. Real-time assets (show quality)
  3. Real-time Background perspective shifts with the camera

If you are ready to put a little work into the assets and a lot more electricity into the stage, with the possibility of skipping some post-production altogether, then LED stages are the latest option to consider.

This technique takes a real-time game-engine environment and connects it to an array of LED panels. It then uses the camera-tracking information to render the perspective appropriate to the camera’s position. When you do this right, the walls and even the floor simply disappear to the audience.
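The perspective shift can be sketched with simple geometry: treat the LED wall as the plane z = 0 and draw each virtual point where the line from the tracked camera to that point crosses the wall. This is a toy model with made-up coordinates, not the actual off-axis projection code a game engine uses on a stage:

```python
def project_to_wall(camera, point):
    """Find the (x, y) spot on the LED wall (plane z = 0) where `point`
    must be drawn so it looks correct from the camera's tracked position.

    camera: (x, y, z) with z < 0, i.e. in front of the wall
    point:  (x, y, z) with z > 0, i.e. virtual scenery behind the wall
    """
    cx, cy, cz = camera
    px, py, pz = point
    t = -cz / (pz - cz)  # where the camera->point line hits z = 0
    return (cx + t * (px - cx), cy + t * (py - cy))

# A virtual mountain 10 m behind the wall, seen by a camera 2 m in front.
mountain = (0.0, 3.0, 10.0)
print(project_to_wall((0.0, 1.5, -2.0), mountain))  # camera centered
print(project_to_wall((1.0, 1.5, -2.0), mountain))  # camera moves 1 m right
# The drawn point slides left as the camera moves right: that parallax
# is what sells the depth of the background to the audience.
```

Done for every pixel of the camera’s view, this gives the camera-correct region of the wall that the stage re-renders in real time as the camera moves.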

Getting the light matched between the foreground and the background is key to any successful visual effects shot.

The benefit of using an LED stage is that the foreground, the background, and the integrated lighting (the light the LED panels cast onto the set and characters) all marry together. Done successfully, this maintains the illusion for the audience.

LED-Wall BG on a Practical Set

This approach brings some added perks: it saves you the headache of big location changes in physical production.

An LED stage can take an entire crew and its gear intact across the world or back in time in just a few minutes.

The Process of LED Wall Virtual Production in Star Wars Series Mandalorian

And if at any point you decide the digital content on the LED screen isn’t living up to your expectations, you can switch that section to a green screen with a button click and keep the other advantages of the rest of the stage, such as lighting and creative context for the actors. You might do this, for example, when sending a take to editorial as a visual reference before picking up a safety shot.

Creators can replace the LED background with a green screen if they are not happy with the visual outcome, while the rest of the LED wall can still be used for integrated lighting and as a reference for the visual effects artists who will improve the look in post-production.

Future of Virtual Production

Moore’s Law states that we can expect the speed and capability of our computers to increase every couple of years, while we pay less for them.
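A rough, hedged illustration of why this matters for real-time graphics: doubling every two years compounds fast. The two-year doubling period and the 8x figure below are illustrative assumptions, not measurements.

```python
def relative_capability(years, doubling_period=2.0):
    """Relative compute capability after `years`, assuming a Moore's-law
    style doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# An effect that renders 8x too slowly for real time today would,
# by this naive extrapolation, hit real-time speed in about six years.
print(relative_capability(6))   # -> 8.0
print(relative_capability(10))  # -> 32.0
```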

You can’t separate technology from content creation. By adopting certain workflows from live cinema, certain workflows from animation, and certain workflows from gaming, there’s so much overlap, evolution, and cross-pollination, because the technologies are starting to merge; there’s confluence. It used to be that certain things were good in movies and certain things were good in games, but those divisions are becoming harder and harder to draw, and it’s becoming more about narrative style.

Once these new tools (consumer-facing virtual cinema products) are available to individuals and small companies, and as the price point goes down and Moore’s Law kicks in, more people will adopt them to explore and experiment. They will come up with cool things to do with them, and then everybody will start to benefit from those innovations.

— Jon Favreau

Virtual Production and the Metaverse?

Metaverse technology has an awful lot in common with virtual production. Both generate scenes in real time, and those images are keyed either to the movement of a camera or to an individual viewer. Up to a point, the technology is very similar, largely because what the two have in common is the game engine, or, more accurately, what the game engine will become.

Nvidia’s Omniverse shows what can happen when metaverse-type thinking is applied to real-world problems.

Virtual production is likely to become both a supplier of content for the metaverse and a consumer of assets built there.

Virtual sets and props can be copied and pasted; they can be built in a room on a computer and sold online to the masses, who will then use them to build something new of their own.

How the metaverse will change filmmaking

The metaverse is a simulated reality that aims to take the place of the internet. To build it, you need digital sets and locations, props, artists to visualize the world, and technicians to construct it. That’s where the metaverse and filmmaking will converge in the future.

You could shoot a location halfway across the world, or an intergalactic planet that doesn’t even exist. You could shoot such locations from home or from any decent local studio geared for virtual filmmaking.

On a virtual set, you only need the props a character actually touches; everything else can be simulated on a computer. You can create million-dollar sets and locations for a fraction of the cost with virtual technology.

The actor and the props in color are the only real things on the set, against the green screen background.
After compositing a real-time virtual environment of a college dorm.

Soon you’ll have VR goggles sitting next to your smartphone. Once the technology becomes accessible to individuals, they will come up with new creative ways to explore and advance it that we can’t even imagine right now.

The technology can affect almost every other industry. It can be used in education for practical learning, for example, where the virtual model of a particular machine or building can be explained easily to students who view the 3D model while wearing VR goggles.

Or take the YouTube-style small content creation industry, where an influencer or content creator could shoot against a blue or green screen and add whatever visuals suit the content. These visuals and virtual environments could be bought or copied from metaverse resources available online.

Thanks for reading!

If the article provided some value to you, do share, and follow me for more such content at https://medium.com/@rahul09

I regularly write about filmmaking and all aspects of content creation among a few other things.
