Great documentaries reveal history’s truth. Unregulated AI threatens to distort it

A furious political leader shouting a message of hate to an adoring audience. A child crying over the massacre of her family. Emaciated men in prison uniforms, starved to the edge of death because of their identities. As you read each sentence, specific imagery likely appears in your mind, seared in your memory and our collective consciousness through documentaries and textbooks, news media and museum visits.
We understand the significance of important historical images like these — images that we must learn from in order to move forward — in large part because they captured something true about the world when we weren’t around to see it with our own eyes.
As archival producers for documentary films and co-directors of the Archival Producers Alliance, we are deeply concerned about what could happen when we can no longer trust that such images reflect reality. And we’re not the only ones: In advance of this year’s Oscars, Variety reported that the Motion Picture Academy is considering requiring contenders to disclose the use of generative AI.
While such disclosure may be important for feature films, it is clearly crucial for documentaries. In the spring of 2023, we began to see synthetic images and audio used in the historical documentaries we were working on. With no standards in place for transparency, we fear this commingling of real and unreal could compromise the nonfiction genre and the indispensable role it plays in our shared history.
In February 2024, OpenAI previewed its new text-to-video platform, Sora, with a clip called “Historical footage of California during the Gold Rush.” The video was convincing: A flowing stream filled with the promise of riches. A blue sky and rolling hills. A thriving town. Men on horseback. It looked like a western where the good guy wins and rides off into the sunset. It looked authentic, but it was fake.
OpenAI presented “Historical footage of California during the Gold Rush” to demonstrate how Sora, officially released in December 2024, creates videos based on user prompts using AI that “understands and simulates reality.” But that clip is not reality. It is a haphazard blend of imagery both real and imagined by Hollywood, along with the industry’s and archives’ historical biases. Sora, like other generative AI programs such as Runway and Luma Dream Machine, scrapes content from the internet and other digital material. As a result, these platforms are simply recycling the limitations of online media, and no doubt amplifying the biases. Yet watching it, we understand how an audience might be fooled. Cinema is powerful that way.
Some in the film world have welcomed generative AI tools with open arms. We and others see something deeply troubling on the horizon. If our faith in the veracity of visuals is shattered, powerful and important films could lose their claim on the truth, even if they don’t use AI-generated material.