Quite Brilliant’s Russ Shaw breaks down the attraction of virtual production and how the process really works

Q> What attracted you to virtual production?

Russ> I’ve been working in post-production for what will soon be 30 years (scary thought), and having recently sold my post company Nice Biscuits, I needed to decide on my next career move. Over the past few years we’d been undertaking production as a sideline, so when I hit upon an opportunity to combine the two, I thought it was worthy of further investigation.

Some people are starting to describe it as the ‘great film renaissance’ because, as we’ve seen over the years, our industry has a habit of not standing still when faced with challenges. Convergence and industry disruptors are always forcing new ways of working upon us, which inevitably leads to the demise of those stuck in their old ways and the survival of the inquisitive pioneers. I believe virtual production is one of these disruptors, and it’s highly relevant today as a tool for providing a flexible production environment without the associated Covid risks of international travel. It’s also a way of reducing crew sizes, but most of all it’s a way of having a fully controllable shooting environment.

Q> Can you describe what virtual production actually is?

Russ> The simplest way to describe it is as a modern-day take on back projection, but instead of using fixed cameras and projectors we’re using giant LED screens with content that moves proportionally across the screen, respecting the camera’s movement and viewpoint. When it was first devised in the 1930s it was put to great effect for many years, but eventually it died out as a result of CGI and modern-day visual effects. With the advances in game engine and LED screen technology, the idea has been reborn, and it’s only set to get better and better as computing power advances.
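To make “moves proportionally, respecting the camera’s movement” concrete, here is a minimal sketch of the underlying parallax geometry, assuming a simple pinhole camera model (the numbers and function name are illustrative, not part of any real tracking or engine API):

```python
# Sketch of the parallax idea: with a pinhole camera, a sideways
# camera move of t metres shifts an object at depth z by roughly
# f * t / z pixels, so near objects slide further across frame
# than distant ones. The engine re-renders the wall content each
# frame by this depth-correct amount, restoring the parallax that
# a flat, fixed back-projection plate would lose.

def screen_shift_px(focal_px: float, camera_move_m: float, depth_m: float) -> float:
    """Approximate horizontal image shift for a lateral camera move."""
    return focal_px * camera_move_m / depth_m

FOCAL_PX = 1500.0   # assumed focal length expressed in pixels
MOVE_M = 0.5        # camera dollies half a metre sideways

near = screen_shift_px(FOCAL_PX, MOVE_M, depth_m=2.0)    # foreground prop
far = screen_shift_px(FOCAL_PX, MOVE_M, depth_m=50.0)    # distant virtual backdrop

print(f"near object shifts {near:.0f}px, far object shifts {far:.0f}px")
# → near object shifts 375px, far object shifts 15px
```

The same camera move produces a much smaller shift for distant scenery, which is exactly what the LED wall content has to reproduce for the illusion of depth to hold.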

Q> Can you tell us how it works in principle?

Russ> There are a few key elements to a studio setup, and configurations do vary depending on the size of the walls, but very basically the methodology can be described as: camera, tracking device, Unreal game engine, foreground props, LED screens.

So, you have your camera, which is mounted with a tracking device. This provides XYZ camera positioning and lens information to a server, which in turn tells another machine running Epic’s Unreal game engine that it needs to pan, tilt, roll, zoom or pull focus on the CG model it’s displaying. Unlike a lot of CG experiences you may have had, the scene is rendered in real time, so it can literally be rotated, along with the lighting, in front of your eyes. This image is then fed to the LED wall, which creates the illusion of depth, or parallax, through the lens of the camera. The technology is not foolproof, and you can’t literally load a scene up with millions and millions of elements and expect it to play back smoothly, but nonetheless, with optimisation and serious hardware, it is an impressive feat.
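The per-frame data flow described above can be sketched in a few lines. This is a hypothetical illustration of the tracker-to-engine handoff, not real tracking-system or Unreal code; all class and field names here are assumptions for the sake of the example:

```python
# Illustrative sketch of the loop: the tracker samples the physical
# camera's position, rotation and lens state once per video frame,
# and the engine mirrors those values onto its virtual camera before
# rendering the frame that is pushed to the LED wall.
from dataclasses import dataclass

@dataclass
class TrackedFrame:
    x: float; y: float; z: float           # camera position (metres)
    pan: float; tilt: float; roll: float   # camera rotation (degrees)
    focal_mm: float                        # zoom: lens focal length
    focus_m: float                         # focus-pull distance

class VirtualCamera:
    """Stand-in for the CG camera inside the game engine."""
    def apply(self, f: TrackedFrame) -> None:
        self.position = (f.x, f.y, f.z)
        self.rotation = (f.pan, f.tilt, f.roll)
        self.focal_mm = f.focal_mm
        self.focus_m = f.focus_m

def render_loop(frames, camera: VirtualCamera) -> None:
    for f in frames:          # one tracked sample per video frame
        camera.apply(f)       # virtual camera mirrors the physical one
        # here the engine would render the scene in real time
        # and feed the image to the LED wall

cam = VirtualCamera()
render_loop([TrackedFrame(0.0, 1.6, -3.0, 12.0, -2.0, 0.0, 35.0, 4.5)], cam)
print(cam.rotation)  # → (12.0, -2.0, 0.0)
```

The point of the sketch is simply that every physical camera parameter, including zoom and focus, has to be replayed onto the virtual camera within a frame for the wall imagery to stay locked to the lens.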

Q> On the test film you shot there was a large screen hanging above. What was this for?

Russ> By using a roof-mounted LED display with extremely high brightness (nit) values, we’re able to display the lighting from the CG environment and illuminate our foreground talent and set, so that the colour temperatures match and provide realism. In some of our scenes we simply showed the sky with drifting clouds, but in the scene featuring the car we displayed street lamps and colour streaks, which, as you can see, worked brilliantly over the car’s black reflective bodywork and windscreen. I think this in particular highlights the difference between shooting against green screen and LED, as the reflections are just so natural. If you’d shot traditionally in a green screen studio, you’d have had to pre-guess the foreground lighting or, at best, match it from a reference backplate on set. You’d also have spent lots of time in post warping and tracking highlights back over the car.

Q> How long did it take to shoot your film?

Russ> We had eight scenes to shoot, some with multiple angles but we allowed for four scenes per day. I’m pleased to say we achieved this without any overtime and hit our one-hour setup and one-hour shoot schedule pretty well.

Q> Was the process similar to traditional shooting?

Russ> The pipeline is interesting, as it effectively takes most of post-production and repositions it as pre-production.

In our case the environments we used were pre-purchased models from the Unreal Marketplace, which is an Aladdin’s cave for virtual assets. We adapted and tweaked the models, finalising them in the week running up to the shoot, but given more time we could have built the scenes from scratch in CG, or even produced photoreal environments using photogrammetry and lidar scanning.

With the Unreal designers we decided on such things as what part of the set we would see and which elements would be visible or obscured. By placing a virtual camera and a placeholder actor in the virtual scene, we were quickly able to spin around our rooms, move lighting and pre-viz accurately. From a prop master’s perspective, it was kind of like saying: we’ve furnished half the room for you, but can you now go and source the rest, for real.

After the backgrounds were finalised, we tested them with the kit being used on set, so we knew they’d be optimised and ready to work straight away. Other than the odd on-set re-framing adjustment, the rest was similar to a normal shoot, apart from the speed of changing set or ‘virtual’ locations. Most of the crew found themselves staying in the same positions, as there wasn’t a requirement to rig or de-rig large amounts of kit and lighting, just the odd occasion where we wheeled in additional foreground lighting.

Q> What did you take away from shooting this?

Russ> I think there’s a lot of discussion to be had about the benefits of shooting this way, and although I agree it’s not going to work for every script, with some consideration, and everyone being on board from an early stage, there’s certainly a lot that could easily be adapted to suit this method of shooting. Although our film doesn’t show it, establishing shots could have been used to great effect, utilising either the same CG environments from different angles, or ones shot independently using one-man crews or even stock footage.

What I really took from this was that it has the ability to restart production: crews don’t have to travel, and you can achieve multi-location setups in a fully light-controlled environment. Think all-day golden hour and you have a DOP’s and stills photographer’s dream. It works well for the actors too, as the environment is more immersive for them; they can turn and feel part of the world they’re acting in, which is not the case on a green screen set.

I should also probably add that we spent just three hours grading and a further three hours in online, which shows that the majority was captured in camera.