Can Hollywood Go Virtual After Coronavirus?

Millicast’s main base of operations is in north Los Angeles, just a short journey from Hollywood. Millicast has been working with several Hollywood studios to enable a remarkable transformation in the way that movies and TV shows will be produced in the future.

This transformation has had a big impact on remote editing and production teams dealing with COVID and the “Shelter in Place” ordinance.

The industry needs a solution offering broadcast-quality features while enabling consumer-grade workflows.


Background

CNBC produced a video that explains the industry’s journey toward a truly digital movie-making workflow:

CNBC: Can Hollywood Go Virtual After Coronavirus?

This documentary tracks the journey from Jurassic Park to Avatar, the remake of The Lion King, and The Mandalorian, tracing how movie production has evolved toward sets built from virtual characters and virtual environments:

The fusion of computer games and movie making is all but complete. All stages from pre-production, pre-visualization, storyboarding, to production and post-production are now driven by virtual visualization.

This has been driven by the adoption of computer game technology from (for example) Unreal Engine:

Creativity has moved to post-production in a digital world where anything can be manipulated. Teams of creative artists and editors are connected by high-capacity networks to large servers, ultra-high-definition monitors, and software. Co-located and working in real time, a movie is no longer “cut” from daily rushes but created in real time as the movie is shot on a digital set.

And here lies the problem.

Hollywood has all but been brought to a standstill by the Coronavirus pandemic. Shelter-in-Place directives have dispersed the editing and production teams to work from their homes.

But working from home demands that you link up to your local ISP, suffer the vagaries of domestic Internet connections and work on your consumer grade computer (however high the spec, it’s not your studio kit).

When the film production process demands that you are in a digitally connected team — all working in real time — you are suddenly confronted with problems.

Let’s consider what was going on in the studio before the lock down. You were fully digital — connected in real time to your co-workers during:

  • Pre-visualisation: you were testing visual scenarios and location alternatives
  • Pre-production: you were testing possible virtual scene variations (colour, positioning, camera angles, and illumination) before shooting the scene
  • Production: you were shooting and digitally mixing real and virtual components on virtual sets
  • Post-production: the most demanding stage in terms of visual quality, where the finished movie is created: adding special effects and editing for colour, sound, and impact, with large editing teams working together in real time to create the final scenes

It is in post-production that the “heavy lifting” takes place — creating the finished movie in real time. And here the term “real-time” used to refer to real-time renderings, i.e. the capacity to compute the final scene with all effects applied, with the level of quality and the resolution the studios expect for validation and release.

Real-time rendering used to be a challenge, even in a normal studio environment without the challenges of remote working. Today people equipped with the right hardware and the right connections between them can function and create the finished result before their eyes. It still requires that the teams have access to dedicated networks, studio-level camera and display hardware, and powerful servers located in one or more linked on-site studios. A far cry from your usual Working-From-Home (WFH) setting.

As well illustrated in the CNBC documentary, post-production is a creative process that requires teamwork. The technology used today is great for “local but dispersed” work from one studio or set to the next, as though team members were standing next to and interacting with each other in real life.

However, the system fails when confronted with the current WFH / shelter-in-place remote environment. In the latter case, people are at home, with consumer grade hardware, desktops or laptops, and public internet access.

The show must go on…

And for the show to go on, the studios must find a way to enable a production environment for their WFH teammates. That is, to create the same output as would be achieved with the high speed, high capacity networks and expensive applications.

This is how the challenge was explained to Millicast by a Senior Executive of one of the top 5 Hollywood Studios:

“Whether in the studio or WFH, post-production and validation of content requires the same holy trinity.”

That is, these THREE requirements:
1. Movie Production Quality
2. Near-zero latency for team interactivity and collaboration in real-time
3. Affordability, particularly because everybody needs the solution at home.

REQUIREMENT 1: QUALITY
In terms of quality, whatever you are using at home (desktop / laptop, browser or application) you have to deliver “movie production quality” — that is, it needs to handle the full breadth of colour and sound.

Colour measurement refers to three things: colour depth, colour accuracy and colour sampling:

Colour Depth
The base is 8 bits per channel; 10 and 12 bits are available, and the higher the better. At 12 bits, each colour channel (red, green, and blue) can represent 4,096 distinct levels, covering the full range from black to white.
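The jump from 8 to 12 bits is easy to quantify. Here is a minimal sketch (my own illustration, not Millicast code) of how many distinct levels each bit depth gives per colour channel:

```python
def levels_per_channel(bits: int) -> int:
    """Distinct intensity values a colour channel can represent."""
    return 2 ** bits

# Each extra pair of bits quadruples the number of gradations,
# which is what eliminates visible banding in smooth gradients.
for bits in (8, 10, 12):
    print(f"{bits}-bit: {levels_per_channel(bits):,} levels per channel")
# 8-bit: 256, 10-bit: 1,024, 12-bit: 4,096
```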

Colour Accuracy
To work at an acceptable resolution for post-production, the stream needs to be rendered and viewed as a high dynamic range (HDR) image, which more closely corresponds to the physical luminance or radiance values observed in the real world. Note that HDR requires a minimum of 10-bit colour depth.

Colour Sampling
Without going into the technical details, there are three levels, from standard to best: 4:2:0, 4:2:2, and 4:4:4. The numbers describe how the chroma (colour) components of the image are sampled. 4:4:4 keeps the chroma at the full resolution of the image and is the best possible chroma sampling.
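To see why chroma sampling matters for bandwidth, here is a back-of-the-envelope sketch (the function and frame parameters are my own illustration, assuming an uncompressed 10-bit UHD frame):

```python
# Average samples per pixel: one luma sample plus the chroma samples
# that survive subsampling.
CHROMA_SAMPLES_PER_PIXEL = {
    "4:4:4": 3.0,   # full-resolution Cb and Cr
    "4:2:2": 2.0,   # chroma halved horizontally
    "4:2:0": 1.5,   # chroma halved horizontally and vertically
}

def frame_bytes(width: int, height: int, bits: int, scheme: str) -> float:
    """Uncompressed size of one video frame, in bytes."""
    samples = width * height * CHROMA_SAMPLES_PER_PIXEL[scheme]
    return samples * bits / 8

for scheme in ("4:2:0", "4:2:2", "4:4:4"):
    mb = frame_bytes(3840, 2160, 10, scheme) / 1e6
    print(f"{scheme}: {mb:.1f} MB per uncompressed UHD frame")
# 4:2:0: 15.6 MB, 4:2:2: 20.7 MB, 4:4:4: 31.1 MB
```

Full 4:4:4 sampling carries twice the data of 4:2:0, which is exactly the trade-off consumer streaming makes and post-production cannot.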

And then there is audio quality — the bare minimum is stereo audio, but the standard nowadays is 5.1 or 7.1 surround sound.

Our goal is to eventually offer these features natively in the browser, working with browser vendors to build in this support. We are enabling these features with playback devices like Apple tvOS:

This demo shows how to install and configure Millicast Player for Apple tvOS to playback WebRTC streams in real-time with less than 500ms of latency globally at scale.

To summarize, to be adequate for post-production work the application must support professional-grade input (SDI), 10- or 12-bit colour depth, HDR, 4:4:4 chroma sampling, and surround sound, all at interactive latency. WFH means that this application must now run on a consumer-grade desktop or laptop, over the public internet.

REQUIREMENT 2: NEAR-ZERO LATENCY
Team collaboration: Remote post-production is not a scale game. It’s a small group of engineers / editors / designers working from home on a scene together. The challenge is to keep the quality (as described above) at a real-time latency for true interactivity and collaboration.

It is pointless for me to make an edit to a scene while it is rolling if co-workers only see it three to five seconds later. Today, existing solutions either provide good quality at an unbearably high latency (2–3 seconds at best), or lower latency but with an insufficient rendered image quality.

REQUIREMENT 3: COST
Existing remote production equipment (studio servers, application hardware and software) is extremely costly. Some of the most widely used hardware encoding and streaming “boxes” cost as much as $50,000 apiece. Most require dedicated private networks installed at home. Not a financially attractive proposition for a full WFH post-production team, notwithstanding the performance limitations.

Enter Millicast

The reason many Hollywood studios and post-production companies all over the world are using Millicast is its capacity to provide the “holy trinity” of requirements described above in a WFH setting.

Some post-production solutions incorporating the current Millicast Platform already meet many of the requirements.

The Millicast Studio native app, with enhanced WebRTC encoder and decoder optimizations, plans to add real-time support for 8K, 4:4:4, 12-bit HDR video, finally enabling post-production studios to WFH even for the most demanding aspects of their work (visual effects, colour grading, etc.).

Millicast Studio is a software application that provides real-time source and device capture, encoding, and broadcasting natively using WebRTC.

QUALITY
Using an extended version of WebRTC, Millicast currently supports up to 10-bit HDR video, 4:4:4 chroma sampling, and 5.1 audio, while staying compatible with the Chrome browser for rendering and playback.

COLLABORATION
Typical latency between team members is less than 300ms, enabling interactive collaboration in real time. Millicast also provides security and encryption on par with what studios already use today.

BANDWIDTH
Millicast uses newer video codecs like VP9 and AV1, which not only provide better compression (less bandwidth, therefore lower cost and lighter demands on a domestic network) but also include extra resilience features to handle the unreliable quality of public internet access at home.
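A rough calculation shows why compression is non-negotiable here (the function and the 300:1 ratio are my own illustrative figures; real ratios depend heavily on the codec, settings, and content):

```python
def raw_bitrate_mbps(width: int, height: int, bits: int,
                     samples_per_pixel: float, fps: int) -> float:
    """Uncompressed video bitrate in megabits per second."""
    return width * height * samples_per_pixel * bits * fps / 1e6

# UHD, 10-bit, 4:4:4 (3 samples per pixel), 30 fps:
raw = raw_bitrate_mbps(3840, 2160, 10, 3.0, 30)
print(f"raw: {raw:.0f} Mbps")  # roughly 7,465 Mbps uncompressed

# An assumed 300:1 compression ratio (illustrative, not a measured
# codec figure) brings this near a typical home-broadband budget:
print(f"compressed at 300:1: {raw / 300:.0f} Mbps")  # about 25 Mbps
```

Uncompressed, a single such stream would need several gigabits per second; only a modern codec makes it fit a domestic connection.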

COST
Beyond capture and output, where your usual SDI input/output cards can be used, no special hardware or network access is needed to enable the full editorial, design, and creative process to be undertaken by a sizable remote and dispersed editing team in a WFH environment.

While still limited by the CPU and bandwidth available (do not expect to stream 8K at 1 Mbps on a Chromebook anytime soon), the Millicast solution works on off-the-shelf PC or Mac computers such as a MacBook Pro, without an external encoder or decoder.

Illustration of the codec quality setting panel from Millicast Studio.

Bottom Line

The reason numerous studios are using Millicast is that it extends WebRTC to support the level of quality post-production requires, and delivers that quality at the interactive latency their teams require, over the public internet.

Sign up for a free trial and get started today.

The Fastest Streaming on Earth. Realtime WebRTC CDN built for large-scale video broadcasting on any device with sub-500ms latency.
