Every creator on earth now feels the guiding hand of AI.
On social media, TikTokers are rewarded with massive views for tailoring content to an algorithm meticulously designed to trigger dopamine release. In Hollywood, producers are rewarded with lucrative film deals for developing projects that feed the black-box AI at studios and streaming platforms, which keep valuable viewership data insights to themselves. That viewership data is built on feedback loops: recommendation engines shape viewer behavior, and that behavior in turn reinforces the engines. It is value creation increasingly usurped by machines, and between TikTok and the streaming platforms, the precious space that allows for human-first innovation is closing. TikTokification is metastasizing.
The Writers Guild is right to push for protections against AI, but nowhere are these protections more urgent than in the documentary and nonfiction space, where I have worked both as a producer and a writer.
The stakes are high, and creative careers hang in the balance. But the greatest threat to broader culture posed by ambient machinery isn’t the bottom-up, AI-generated art populating social media (think: Wes Anderson Directs Star Wars). It is the top-down, AI-powered platforming of art, which we’re already seeing across the media landscape — algorithms deciding, on a global scale, which stories to tell and how — and it is especially insidious in the realm of nonfiction.
“The danger is less about AI in the creation of documentary, the actual production, and more in the curation of it,” says Amit Dey, executive vp nonfiction at MRC, which has untitled Sly Stone and Rudy Giuliani documentaries in the works. “It’s one thing if human-made films are competing in the market against robot-made films. It’s another thing entirely when data in the form of artificial intelligence, or proprietary algorithms, shape the decisions around what human audiences are exposed to. In other words, what gets bought and when. What gets platformed and where. What stories get told.”
Media veteran and producer Evan Shapiro, who just headlined at MIPTV, says outsourcing accountability is a time-honored tradition in Hollywood. “From dial testing to focus grouping to ‘My kids didn’t like it,’ a certain species of TV executives have long ceded their greenlighting decisions to a range of third-party safety nets that protect them from actually making the choices themselves,” Shapiro says. “These devices allow execs to take credit when shows work, and easily pass the buck when they don’t. AI is simply the latest excuse fad.”
And yet, AI is already hard at work at every level of filmmaking.
At XTR, which backed Magic Johnson doc They Call Me Magic for Apple TV+, CEO Bryn Mooser has built a proprietary algorithm named “Rachel” to help guide his development process. He calls it a “zeitgeist machine” that combs through social media to see what’s trending and then focuses his development around those signals.
“I got a lot of shit about it,” Mooser says. “Then ChatGPT came and the world changed overnight. We had always been thinking of it as a tool, and as a tool it’s incredibly useful. What conversations are trending. What people are talking about. We built it so we could overlay that with historical data in the documentary business. What works, what doesn’t. Its application as a tool to enhance what filmmakers can do is incredibly powerful and important. And my hope would be that it’s embraced.”
It’s true as well that human executives still make the final greenlight decisions at these platforms, but with the growing wealth and power of AI-generated data insights — data insights that have been proven to drive viewer engagement, for better or worse — executive willingness to die on the hill of one’s own (human) opinion is fading. Why take risks on more novel concepts when, for example, the true crime genre is a sure-fire hit factory, according to the data? It’s human nature, especially in this job market, for an executive to cover themselves. I don’t blame a single one of them. But in Hollywood’s rampant CYA culture, now AI-powered, executives may be covering themselves out of existence.
And without smart (human) executive intervention to challenge our baser instincts as viewers, the urge to tap relentlessly on puppy videos, is engagement on most of these platforms even worth celebrating? For TikTok, maybe. From a more sophisticated aesthetic standpoint, the unchecked race to maximize viewer engagement is a race to the bottom. Worse, from a journalistic ethics standpoint, in the realm of nonfiction it is a race to ignorance and delusion.
In 2021, filmmaker Morgan Neville famously used AI to re-create Anthony Bourdain's voice in the doc Roadrunner, and the move received backlash. For his part, Neville pulled only real quotes from actual print interviews with Bourdain and used the deepfake tech "to make them come alive." And last year, the Netflix docuseries The Andy Warhol Diaries waded into similar terrain, re-creating Warhol's voice to narrate. That type of controversy feels much less incendiary in 2023, when AI tech has advanced enough to make wholly fake audio, video or photos appear like real life.
There is much to say about the moving goalposts of ethics within the documentary craft these days — with or without the use of AI as a filmmaking tool. The more sinister force at play, however, and the one driving what might be considered widespread ethical breaches, is the potential ceding of human curation to algorithms and leveraging data to decide which projects to buy and even how to shape them on an act-by-act basis. Yes, there were focus groups and dial tests in the past. Yes, there was Nielsen data. But the processes behind the insights were transparent. There was human accountability. As the industry cedes more of these decisions to black-box AI, the technology ceases to be a tool to streamline development and maximize profits — it becomes the decider itself.
And I don’t think we need a Black Mirror episode to sketch the horror of this scenario, especially in documentary.
Nonfiction storytelling is what shapes our understanding of the real world. For this reason, the preservation of human-curated documentary rises in urgency above other genres. Hollywood has always attempted to balance commercialism and artistic expression, which has historically allowed it to forge its own brand of art for the masses. But now, more than ever, the world's relationship with reality is at stake. The disinformation plague is already rampant on social media, and curation algorithms are largely to blame.
Furthermore, in order to fulfill its obligation to the truth, nonfiction requires trust from its audience — trust rooted in transparency and integrity — and it relies exclusively on end-to-end human control to build it.
To use deepfake technology as an example, if the viewer can’t trust the veracity of the images they see, or the audio they hear, the film loses its power. Unlike in narrative film or television, if the audience cannot trust a documentary’s integrity as a work of nonfiction, it falls apart. “Joe Hunting. The Ross brothers. Jessica Beshir. These are filmmakers who are making changes with their artistry,” Mooser adds. “It will be a long time before AI can compete.”
With respect to accountability, the same can be said for the administrative level of nonfiction, the editorial role inhabited by the film executive (and increasingly shaped by AI-generated data insights). With a human at the helm, the audience can question the motives of a studio or platform for greenlighting a film — be they commercial, political or both — but the audience cannot question the motives of an algorithm showing something to an audience because it deems that content “popular.”
For Josh Braun, co-president of documentary behemoth Submarine, there is a deep-seated hunger for rule-breaking that defines us as humans, and this expresses itself in a perennial desire for the fresh and new. “This is the savior of the potential nightmare scenario. No matter how you slice it, people have visceral reactions to things. This will push the most interesting docs back into the distribution companies. The Neons. The A24s. The Magnolias. The IFCs. These are the places we’re seeing deals,” Braun says.
And the indie market could be the bulwark. “The more esoteric titles that people want will be what rejuvenates the theatrical marketplace,” Braun adds. “You won’t get the same level of choice on the algorithm-driven platforms.”
As the industry integrates AI into every aspect of the business, technology must remain a tool, not a substitute for human judgment and accountability. That’s what the Writers Guild is pushing for right now in its standoff with producers and studios. And protecting the integrity of nonfiction storytelling is paramount, as it is one of the few remaining domains where truth — and trust in a shared understanding of the world — is sacrosanct. Ultimately, whatever is done under the influence of AI must be done with a strong moral compass, guided by the principles of honesty, transparency and respect for the dignity of the people involved in the stories that are told.
But maybe it’s already too late.