Art & Entertainment

Transforming Cinema With Virtual Production

In a few years, virtual production won’t be qualified as such; it will simply be called production. Because that’s how production will happen.

Where Magic is Made: Annapurna Studios’ virtual production stage

Imagine the following scenario: you’re a filmmaker who has to shoot a complicated action sequence in a remote jungle with a big movie star. Normally, you’d travel to such a location with your crew, spend money on stay and travel, and deal with the other logistical hassles that accompany shooting on location. But now, with a technology like virtual production, you don’t have to. It’s a technology that will change not just how films are made but also how they are conceptualised. But before I get into its intricacies, let me first tell you how it all began.

Digital technology made filmmaking affordable and accessible. Before digital, only a certain kind of movie would even get distribution—or a theatrical release. But that’s no longer the case: if you’ve made a solid film, you can get it out there. Around 2016, LEDs and movies started to intersect. We, at Qube Cinema—a firm providing end-to-end digital cinema technology and solutions—began building LED cinema systems, deviating from our earlier focus on projectors. During that process, we noticed that LEDs had made their presence felt in production. That intrigued us: here was an opportunity to get into production technology. Besides, over the last five years, the quality of LEDs has improved by leaps and bounds. So, when you have a large, high-quality image on an LED wall, you can place actors and props in front of it to create a seamless blend when viewed through the camera. The audience should look at it and feel that it’s real and authentic—and not something powered by an LED.

We, at Qube Cinema, are essentially cinephiles, folks who love storytelling. We even made and produced a few movies, such as the bilingual drama 180, the Carnatic concert film Margazhi Raagam (featuring T M Krishna and Bombay Jayashri), and a unique musical concert, One (collaborating, again, with Krishna). These creative pieces were intended to illustrate our key goals: improving quality, maintaining accessibility, and proving that something pioneering can be achieved.

Embracing virtual production is part of that goal. But that’s in the realm of image-making; what about filmmaking? Even if the LED panel reproduces a photorealistic image, the illusion breaks the moment the camera moves. That doesn’t happen in virtual production because, among other things, it incorporates the parallax effect. If you’ve ever travelled on a train and looked out of the window, you’d have noticed that distant hills seem to move slower than nearby objects, such as telephone poles. In the context of virtual production, parallax becomes crucial because it gives the scene depth and realism. The images displayed on the LED panels come from game engines, with Unreal Engine a popular choice. The engine takes the position of the camera as an input and constantly renders a fresh perspective as the camera moves, achieving parallax. And Unreal does all of it on the fly: pulling the camera back shrinks the object on the LED wall, while panning left or right rotates the view. This on-the-fly rendering is why the technique is called ‘real-time’.
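
For the technically curious, here is a minimal sketch of the parallax idea itself, not Unreal Engine’s or Qube’s actual code. It assumes a simple pinhole-camera projection with made-up numbers, and shows why a sideways camera move shifts nearby objects on screen far more than distant ones, which is exactly the depth cue a game engine recomputes every frame from the tracked camera position.

```python
# Illustrative sketch only (assumed pinhole model, hypothetical values):
# project a point's horizontal position onto the image plane and compare
# how much it shifts on screen when the camera slides sideways.

def screen_x(world_x: float, depth: float, camera_x: float,
             focal_length: float = 35.0) -> float:
    """Horizontal screen position of a point under a simple pinhole camera."""
    return focal_length * (world_x - camera_x) / depth

camera_before, camera_after = 0.0, 1.0      # camera slides 1 unit to the right
telephone_pole = (2.0, 5.0)                 # (world_x, depth): close to the track
distant_hill = (2.0, 500.0)                 # same world_x, but far away

for name, (wx, depth) in [("pole", telephone_pole), ("hill", distant_hill)]:
    shift = abs(screen_x(wx, depth, camera_after) - screen_x(wx, depth, camera_before))
    print(f"{name}: on-screen shift = {shift:.3f}")

# The pole shifts noticeably; the hill barely moves. A real-time engine redoes
# this projection for the whole scene, every frame, from the tracked camera.
```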

Storytellers should familiarise themselves with tools like Unreal Engine—which is available for free—to experiment with their ideas. A reasonably beefy computer at home, for example, will allow them to create entire scenes, design camera movements, and incorporate digital humans into their virtual environments. This tech also demands a shift in mindset for creators. Filmmakers must bring on board a director of photography and a production designer familiar with virtual production. The latter plays a crucial role in seamlessly blending the virtual and the real elements, imbuing the scenes with depth. For us, that’s a key challenge: to convince creators to invest time and effort during the pre-production stage. Some struggle to adapt due to financial constraints—and some simply don’t want to change. But if they’re willing to spend more time to prepare, the results will be immensely rewarding.

Since this technology is so new, not many are familiar with it. That can create a talent gap. To bridge it, I suggest introducing virtual production education early. In fact, even without a filmmaking background, individuals can benefit from learning Unreal Engine. And upcoming tech such as the Apple Vision Pro—a mixed-reality headset that blends digital content with the physical world—indicates how our world is becoming increasingly virtual. The next generation of creators should, as a result, get used to 3D content creation tools. Because in the long term, traditional roles may either evolve or get eliminated, particularly those related to physical set construction. So film schools and institutions teaching visual communications should incorporate virtual production into their curricula. Some colleges, such as New York University, already offer degrees in virtual production. Academia should show urgency because virtual production education will help avoid ad-hoc approaches and reinventing the wheel. I’m concerned that without proper mainstreaming, the craft may not advance, leading to poor production values in films.

Virtual production entering the mainstream and the rise of Artificial Intelligence (AI) share an intimate relationship. Their convergence has resulted from advancements in computation, which have made both feasible. Qube Labs, our research arm, is actively exploring the intersection of AI and virtual production, focussing on Generative AI. This technology is already helping greatly with creating concept art, and we are beginning to see how Gen AI can produce near-final virtual backgrounds. As it continues to mature, we are going to see more ‘final pixel’ work happening entirely on the computer, perhaps even removing the need to project anything on a big wall. Performances of live actors can be captured separately and blended with virtual elements ranging from backgrounds and props to virtual humans.
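
To make that blending step concrete, here is a rough, hypothetical sketch of the standard ‘over’ compositing operation, not our pipeline or any specific tool: a separately captured actor plate is combined with a generated background using a per-pixel matte. The arrays and values are invented purely for illustration.

```python
# Hypothetical compositing sketch (assumed workflow, illustrative data only):
# blend an actor plate over a virtual background using an alpha matte.
import numpy as np

def composite(actor_rgb: np.ndarray, alpha: np.ndarray,
              background_rgb: np.ndarray) -> np.ndarray:
    """Standard 'over' operation: actor where alpha=1, background where alpha=0."""
    alpha = alpha[..., None]                      # broadcast the matte over RGB channels
    return alpha * actor_rgb + (1.0 - alpha) * background_rgb

# Tiny synthetic example: 2x2 images with float pixel values in [0, 1].
actor = np.full((2, 2, 3), 0.8)                   # bright actor plate
matte = np.array([[1.0, 0.0], [0.5, 0.0]])        # 1 = actor, 0 = background, 0.5 = soft edge
background = np.zeros((2, 2, 3))                  # dark generated background
print(composite(actor, matte, background))
```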

Now let me address the cost implications of virtual production stages and the current state of the industry. This conversation runs into a paradox: wanting to commoditise virtual production while acknowledging the current price challenges. The goal is, of course, to reach economies of scale, making virtual production stages more affordable across the country. The other challenge is creating virtual backgrounds at a high enough quality. The very limited pool of virtual artists who can accomplish this makes it rather expensive. Besides, the digital assets displayed on the LED wall—even those offered on current marketplaces—take a long time to produce. But the high cost can also be justified by the current capabilities, which allow productions to create complex scenes without extensive travel to external locations.

Filmmakers should, however, consider shooting specific sequences on virtual stages rather than their entire films. Imagine big stars splitting the shoot between different virtual environments in the same location. It provides a huge financial incentive. That’s the transformative potential of virtual production. We believe that, in a few years, virtual production won’t be qualified as such; it will simply be called production. Because that’s how production will happen.

Rajesh Ramachandran is the Chief Technology Officer and President at Qube Cinema

(This appeared in the print as 'Transforming Cinema')