More than 300 volunteer filmmakers from around the world, working with Amazon, Google and Microsoft cloud services and dozens of tech developers, are wrapping an ambitious effort to examine the potential of cloud production. Planning started in February 2020, initially with the idea of creating a single short using the cloud, but a month later, the approach went from experiment to necessity as the world grappled with COVID-19.
The effort, led by veteran color scientist Joachim “JZ” Zell through the Hollywood Professional Association, grew as filmmakers in more and more countries found cloud collaboration to be critical to working amid the pandemic. The result is six shorts made using the cloud, filmed in five cities around the world — Hollywood, Dubai, Brisbane, London and Mexico City — plus an animated entry from Ulaanbaatar, Mongolia. Postproduction involved collaborators from the countries involved in the shoots, as well as additional countries including Brazil, Colombia and Lithuania. Additionally, diversity was a priority, and each short was directed, lensed and/or produced by women filmmakers.
The broad takeaway: Cloud production has arrived, though the project identified areas where more technological advancement is needed. “The cloud is part of our future. It’s a big game-changer,” says Mandy Walker, an Academy Governor for the cinematographers branch, who recently wrapped production on Baz Luhrmann’s upcoming Elvis movie. Walker served as supervising director of photography on the poetic short Tangent. Deliberately, she never appeared on set. Instead, she monitored the weekend shoot live from her computer using cloud-based services to collaborate with others involved.
Each production implemented COVID safety procedures and used different combinations of technologies, allowing the participants to experiment with numerous workflows that enable remote, cloud-based production tasks including live review of camera footage, dailies, and participation in editorial, color grading and mixing sessions. The experiment involved the Amazon, Google and Microsoft clouds, as well as a range of technologies from participating manufacturers including 5th Kind, Adobe, ARRI, Avid, Bebop, Blackmagic, Colorfront, Evercast, Frame.io, Moxion, Sohonet, Teradek and Teradici. The shorts were also made with the collaboration of many companies, from VFX house Framestore to Skywalker Sound (which mixed four of the shorts).
Tangent was created by writer-director Ruby Bell, working with Walker, who is also Bell’s mother. The short was lensed over one weekend in Brisbane and involved numerous members of Walker’s Elvis crew. Walker and Bell say they were intent on hiring a diverse crew. Says Walker: “I believe in creating more opportunities and diverse crews, and the way to do that, I feel, is to bring people up.”
On Tangent, the co-writer, producers, editor and composer were all women. The project also created opportunities for crewmembers such as Elvis B camera operator Jay Torta, who served as director of photography. “We could bump [crewmembers] up a couple positions to the job that they ultimately want to do,” Bell explains.
Following the shoot, the Tangent filmmakers used the cloud to collaborate during postproduction. The short was edited and scored in Sydney, with color grading from Dolby’s Vine Theater in Hollywood and sound mixed at Skywalker Sound in Northern California. “It was a good way to test these things, and it worked pretty seamlessly,” Walker says. “I’m going to be much more relaxed about using their systems [on future productions].”
Some relatively new technologies were also part of the test. For instance, in Dubai, the filmmakers put a prototype of Blackmagic’s 12K camera (now available) through its paces. In Hollywood, the crew used Frame.io’s new Camera to Cloud system to send proxy (low-resolution) files of the footage to the cloud, allowing remote editorial and postproduction to begin.
The Hollywood production, horror-comedy Found Lederhosen, directed by (and starring) Barbara Wilder, took place on an LED virtual production stage at ARRI in Burbank, with DP Chris Probst using ARRI-developed tools to operate the camera remotely, from a separate room in the complex, in order to limit the crew size on set. Additionally, a LIDAR scan of the ARRI stage was used by Digital Film Tree to create a 3D “safety viz,” which is effectively a previsualization of where the crew would be positioned on set in order to follow COVID safety procedures, implemented by the on-set COVID compliance officer.
According to Zell — who is vice chair of the Academy of Motion Picture Arts and Sciences’ Academy Color Encoding System project (ACES was used as part of the testing) — the cloud is ready for Hollywood production, though a key takeaway was that when it comes to uploading full-resolution data to the cloud, more technological advancement is needed. Initially, per the plan, “the cinematographers would take the magazine [with the camera footage] home at night, upload [full-resolution footage] to the web, and in the morning they could delete them and record to the magazine again. This failed everywhere. Just not in Brisbane, where from the beginning [the plan] was to download the material at a post facility,” he explains.
“The bandwidth isn’t there yet. Don’t try this at home,” Zell adds, warning that bandwidth requirements are only expanding as camera data grows in file size (the shorts were lensed in 4K, 6K or 8K resolution).
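The scale of the problem is easy to see with back-of-envelope math. As a rough sketch — the footage volume and uplink speed below are illustrative assumptions, not figures from the production — uploading a single camera magazine of full-resolution originals over a typical home connection takes days, not hours:

```python
# Back-of-envelope: time to upload a day's camera originals over a home link.
# The figures used below (2 TB magazine, 50 Mbps uplink) are illustrative
# assumptions, not numbers reported from these productions.

def upload_hours(footage_tb: float, uplink_mbps: float) -> float:
    """Hours needed to push footage_tb terabytes at a sustained uplink_mbps uplink."""
    bits = footage_tb * 1e12 * 8          # terabytes -> bits
    seconds = bits / (uplink_mbps * 1e6)  # bits / (bits per second)
    return seconds / 3600                 # seconds -> hours

# One hypothetical 2 TB magazine over a 50 Mbps home uplink:
print(round(upload_hours(2, 50), 1))  # ≈ 88.9 hours
```

With overnight windows of eight to ten hours, the arithmetic explains why the take-the-magazine-home plan failed everywhere except Brisbane, where material was ingested at a post facility with far greater connectivity.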
“The easiest part to overlook of these new cloud workflows is the transmission process,” agrees Frame.io senior vp innovation Michael Cioni. “Even at a studio like ARRI, the Wi-Fi network might be fine in the morning, but as traffic in a building increases or people start uploading and downloading, the general traffic starts to have a big impact on the important transmission of assets. Also, as we all know, Wi-Fi has limited range. So the key thing is to take the LTE network that is super prevalent worldwide and make sure we have reliable, amplified signals that are capable of moving high-quality video.” Cioni says Frame.io is currently experimenting in this area with L.A.-based Sclera Digital, led by CEO Willis Chung, a former Local 600 DIT, which is working with industrial-grade telecommunications tools.
The completed shorts will begin to roll out today, initially as part of the HPA virtual Tech Retreat program.