That's not my quote, but I sound smarter attributing it down here instead of somewhere more prominent. I want to tell you about virtual video production and how it is changing the world - but just not today. And that's why we are quoting William Gibson above.

Take Off Studios asked us to create a pirate ship set for a video game launch video. This wasn't going to be your costume-shop, har-har-matey pirate setup. Rather, an intricate 17th-century Malacca Strait pirate ship deck with geography- and period-specific treasures that were more valuable than gold at the time - like spices. And some antique chinaware, because you needed something fancy to put your stolen spices in.

After a few discussions, we narrowed our production options to these three:

  • A Virtual Production with the Unreal Engine
  • A Green Screen with motion tracking using the Zero Density Engine
  • Kicking it old school with a built-up set in a green screen studio

We chose the last option because the future had not arrived evenly.

We'll get into why we chose this method and, more importantly, why we didn't go with the game-changing technology that was used on The Mandalorian. But first, let's take a look at this amazing shoot we put together!

Our senior editor, Jon, looks out at the beautiful sunset at sea that only his beautiful mind could see.

We worked with the veteran (and very senior) Art Director, Junior, whose team fabricated a wooden mast, wooden deck, and a ship's wall using the right amount of "distress" to make it look like a ship that had seen many a plunder.

This Shein haul was a secret mix of silks and spices.

Junior and his team then dressed the deck up with barrels, silks, and other props that were to be expected in the gameplay (and also a 17th-century Shein haul).

We started with a blank slate. Ok, it was not blank. It was green. Junior (Art Director), Jaye (Producer), and Lok (DOP) contemplate life.

The drawback of this approach is that you can't fully visualize the set, lighting, and camera placement until everything gets set up. (And calling it visualizing at that point is silly - it's just seeing.)

So measurements and mock-ups help. But it's mainly down to experience. We'd done enough green screen work (such as this recent project for Epson) to know where the pitfalls were, and that getting an even exposure across the green was key (IYKYK). But knowing whether we'd have enough room for the set, lighting, and cameras was the real challenge.
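If you're curious what "even exposure across the green" boils down to, here's a rough Python sketch of the kind of check your waveform scope is doing for you. It assumes NumPy and OpenCV; the file name, the green-dominance test, and the tolerance are all made up for illustration - on set, the scope does this properly.

```python
# Rough sketch of an exposure-evenness check on a green screen frame grab.
# Assumes NumPy and OpenCV; "frame_grab.png" and the thresholds below are
# placeholders for illustration only.
import cv2
import numpy as np

frame = cv2.imread("frame_grab.png").astype(np.float32)   # BGR, 0-255
b, g, r = frame[..., 0], frame[..., 1], frame[..., 2]

# Treat pixels where green clearly dominates as "screen".
screen = (g > 1.3 * r) & (g > 1.3 * b)

# A poor man's waveform: mean green level of screen pixels in each column.
col_means = np.array([g[:, x][screen[:, x]].mean()
                      for x in range(frame.shape[1]) if screen[:, x].any()])

spread = col_means.max() - col_means.min()
print(f"Green level spread across the screen: {spread:.1f} / 255")
if spread > 20:   # arbitrary tolerance
    print("Lighting looks uneven - expect a harder key.")
```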

But shiver me timbers (translated from Bahasa Indonesia), the shoot went off smoothly, with crew and clients from all over the world sitting in, both in person and virtually. We used OBS to give everyone a live look at the set, with a rough key-out of the sky and sea, motion included - in real time.

Now let's get to the decision we made - and explain each approach and its pros and cons.

1. Virtual Production with the Unreal Engine

The Unreal Engine comes to us from video games, where it is used for its amazing ability to render a 3D world in real time. Add in a motion-tracked camera and you have magic - you can put any background behind the actor on an LED screen, and the 3D background "world" moves with respect to the camera's shake and movement. This is at the heart of virtual video production.
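To see why the camera tracking matters, here's a toy NumPy sketch - not Unreal Engine code, just a pinhole camera with made-up positions - showing that when the camera slides sideways, a near prop shifts across the frame far more than distant scenery. Re-rendering the background for the real camera's pose, every frame, is what keeps that parallax believable on the LED wall.

```python
# Toy parallax demo: a tracked camera moves, near and far points shift by
# different amounts. Plain NumPy, invented numbers - not Unreal Engine code.
import numpy as np

def project(point_world, cam_pos, focal=35.0):
    """Pinhole projection of a 3D point for a camera at cam_pos looking down +Z."""
    p = np.asarray(point_world, dtype=float) - np.asarray(cam_pos, dtype=float)
    return focal * p[:2] / p[2]          # (x, y) on the image plane

near_prop = [1.0, 0.0, 5.0]     # a barrel 5 m away
far_cliff = [10.0, 0.0, 200.0]  # scenery 200 m away

for cam_x in (0.0, 0.5):        # camera slides 0.5 m sideways
    cam = [cam_x, 0.0, 0.0]
    print(cam_x, project(near_prop, cam), project(far_cliff, cam))
# The near prop's projected position shifts a lot; the far cliff barely moves.
# The engine redoes this (plus full rendering) every frame from tracking data.
```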

Image courtesy unrealengine.com & ILM

Pros

  • The scene can be lit as if on location, without worrying about affecting the green screen lighting
  • You can "build" any set, no matter how big, since it's all a 3D world
  • Easy to "change locations" since you just have to change what's on the LED screen.
  • Reflective surfaces are not an issue. In fact, reflections from the 3D world can be cast onto shiny surfaces, adding realism. (A strict no-go with green screens.)

Cons

  • Expensive studio costs - a massive LED wall (or three) plus an LED ceiling are multi-million-dollar investments.
  • You have to build a 3D world which can be tedious.
  • As of the time of filming, Singapore had only one studio equipped for a shoot like this. And surprise, surprise, it was (and is) expensive. More are being built as we speak, so this will likely sort itself out.

2. Zero Density Engine

This is an in-between solution where you still shoot on a green screen, but get a comped-in, real-time result on a monitor on set. The camera tracking is excellent, and you can place objects in 3D space in front of and behind the actor, making for a believable set. This is an excellent solution for live events and launches.
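To make the "in front of and behind the actor" part concrete, here's a sketch of the layering order - background render, then keyed talent, then foreground 3D elements - in plain NumPy on made-up frames. It's the general idea only, not Zero Density's actual pipeline, and all the array names are placeholders.

```python
# Sketch of the back-to-front layering behind this kind of real-time
# compositor. Plain NumPy on dummy frames - not Zero Density's pipeline.
import numpy as np

H, W = 1080, 1920
bg_render  = np.random.rand(H, W, 3)   # 3D world behind the talent
fg_render  = np.zeros((H, W, 3))       # 3D props in front of the talent
fg_alpha   = np.zeros((H, W, 1))       # where those props cover the frame
talent_rgb = np.random.rand(H, W, 3)   # camera feed
talent_key = np.random.rand(H, W, 1)   # matte from the green screen key

# Composite back-to-front, every frame, driven by the tracked camera pose.
frame = bg_render * (1 - talent_key) + talent_rgb * talent_key
frame = frame * (1 - fg_alpha) + fg_render * fg_alpha
print(frame.shape)  # (1080, 1920, 3) - ready for the on-set monitor
```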

Image courtesy zerodensity.tv/

Pros

  • Cheaper studio costs. No LED walls or ceilings are needed. Just a pre-lit green screen studio optimized for a good key.
  • Flexibility in post - you can "change the world" after the shoot since it was not captured on camera and is simply a 3D project.

Cons

  • Lighting limitations - you have to prioritize the green screen lighting, so only safer lighting setups are preferred. While your camera movements will look seamless, the lighting on the talent might not match the scene well.
  • Studios tend to have setups - cameras, crew, etc. - that are optimized (rightly so) for live broadcasts. Not so much for cinematic shots.

3. Classic Green Screen or Chroma Keying

It's time-tested and it's been around longer than I have - which is saying a lot. You do have to spend a bit more of your resources on lighting the background to make sure it's well exposed (a nice bright green, but not so bright that it bounces back onto your actors). The rest of your lighting is independent of the green screen stuff - which can add up to quite a bit of light on set. But it can be worth it, as it was for this project.
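For the curious, the keyer in your editing software is doing something in the spirit of this little Python/OpenCV sketch: build a matte from how much green dominates each pixel, then blend the talent over the new background. The file names and the matte multiplier are placeholders, and a real keyer adds spill suppression, edge softening, and plenty more.

```python
# Bare-bones chroma key: matte from green dominance, then composite.
# Assumes NumPy and OpenCV; file names and the 4.0 multiplier are placeholders.
import cv2
import numpy as np

fg = cv2.imread("greenscreen_plate.png").astype(np.float32) / 255.0
bg = cv2.imread("sea_and_sky.png").astype(np.float32) / 255.0

b, g, r = fg[..., 0], fg[..., 1], fg[..., 2]

# Matte: how much green dominates the other channels. A well-lit, evenly
# exposed screen keeps this ramp clean and the edges easy to pull.
spill = g - np.maximum(r, b)
matte = np.clip(1.0 - spill * 4.0, 0.0, 1.0)[..., None]   # 1 = keep talent

comp = fg * matte + bg * (1.0 - matte)
cv2.imwrite("composite.png", (comp * 255).astype(np.uint8))
```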

The monitor is your friend. We shot with three Red Komodo cameras, with each camera's waveform scope used as the final check.

Pros

  • Fewer moving parts compared to Unreal Engine virtual production. No lens calibration or camera motion tracking is needed, so you have more control over your shoot.
  • A very versatile method that is cost-effective. Just trust your scopes on exposure matters.
  • Limitations make for more efficient shooting - you have to avoid certain colors, watch every reflective corner of the set, and ensure your lighting is perfect (according to your scopes), so you end up paying a lot more attention to your set.

Cons

  • A "bad key" can multiply editing resources. You go straight to the naughty corner where you have to"rotoscope" your scene frame by frame - where possible.
  • Colors in your frame that can come close to green (or blue) are risky. The light temperature can also push some elements in your set in that direction.
  • Finer things, like hair, are tricky business too. But with the right lenses and resolution, you can plan for them.
  • Motion blur can make moving objects get cut off during the keying. A higher shutter speed is useful here (quick numbers after this list).
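Quick back-of-the-envelope on that last point, assuming a 25 fps project (the frame rate here is just for illustration): exposure time - and therefore how far a moving edge smears - scales with shutter angle over frame rate.

```python
# Why a faster shutter helps the keyer: shorter exposure, shorter blur trails.
fps = 25.0                       # assumed frame rate for illustration
for angle in (180.0, 90.0):      # shutter angle in degrees
    exposure = (angle / 360.0) / fps
    print(f"{angle:.0f} degree shutter at {fps:.0f} fps -> 1/{1/exposure:.0f} s exposure")
# 180 deg -> 1/50 s; 90 deg -> 1/100 s, so a swinging prop smears half as far
# and the matte has cleaner edges to latch onto.
```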

Conclusion

I can imagine the day when we pull up at a virtual video production studio and pick pre-loaded background worlds from a database, along with the lighting guides that come with them. We will simply show up in a studio and film multiple "locations" a day, without the hassle of shifting equipment and crew, "art-ing" the background, or worrying about the weather and time of day. Our imagination will truly be set free. As will our budgets.

Arrr, until we have these solutions made more accessible, we will have to stick with what we know well - even if it's the harder option to master.