The Dilemma of Documentary Proof: Archival Producers Alliance Co-founders Discuss How AI Will Change Archival

By Williams Cole

An interview with two organizers of the Archival Producers Alliance on how AI will shake up the nature of archival footage

For the first two decades of this century there was, relatively speaking, only intermittent media coverage of artificial intelligence and of how deep learning and related technologies were creeping into work, culture, and society. But in late 2022, OpenAI released ChatGPT, a chatbot built on a large language model, to the public, and stories about AI exploded. Coverage of AI in the last year or so has since reached a point of fatigue, even satire: "It's AI" has become a catchphrase for anything, especially anything visual, that seems exaggerated or unbelievable. Every week, it seems, there are new stories about synthetic media and warnings of a deluge of deepfakes coming in this year's presidential election.

While audiences might be overwhelmed, confused, or just plain tired of hearing about AI, tech behemoths and venture capitalists are investing billions of dollars, and increasingly powerful generative models are being developed, tested, and released to the public. This is especially true of something AI models have come to excel at: the creation of synthetic imagery. In mid-February, for example, OpenAI previewed Sora, a new "text-to-video" model that can generate complex, high-definition video clips up to a minute long from text prompts. And while the AI craze is perhaps not in the headlines as much as it was in 2023, for documentary filmmakers it is no longer a question of if or when AI will affect their work, but of how transformative it will be.

For many, documentary is rooted in an idealism built on a fragile set of responsibilities: speaking truth to power and maintaining authenticity in historical narrative (though all these claims come with age-old caveats). And one essential part of the documentary filmmaking process is the use of archival imagery. Archival research and producing have been part of the craft since its early days and are integral to production, especially for historical subjects, as in classic broadcast series like Eyes on the Prize and The Vietnam War, the recent hit Fire of Love, and many others.

But with the advent and rapid development of AI-based synthetic imagery tools, the tried-and-tested practices of archival producing are being challenged. Directors and producers are beginning to consider synthetically generated content in place of intensive skilled research, complicated rights clearances, and the increasing expense of archival producing, and this is opening a Pandora's box of issues regarding documentary proof, authenticity, and ethics. Formed in 2023, the Archival Producers Alliance (APA), a group of more than 200 archival producers, published an open letter in The Hollywood Reporter last year to bring attention to the use of AI in documentary film. Two principal organizers of the APA, Rachel Antell and Jennifer Petrucelli, spoke with Documentary via a bicoastal Zoom. The conversation, edited here for length and clarity, touches on the state of the craft of archival producing, the virtues (and challenges) of archival research, and the kinds of guardrails that should be developed sooner rather than later in this dawning "Age of AI." It also previews the reasoning behind the rough draft of guidelines for ethical AI use in documentary filmmaking that the APA will present at Getting Real '24.


DOCUMENTARY: As experienced archival producers, how and when did you first start to notice the idea of AI creeping into the workflow?

RACHEL ANTELL: Early last year, we were working on a film where some of the archival visuals the filmmakers needed either depicted events that took place before cameras were invented or existed only in small quantities. ChatGPT was in the news quite a bit at that point. We were having a conversation before a production meeting like, "Oh boy, do you think in the next couple of years we'll actually see this entering the doc space?" Lo and behold, the next day we were in the meeting and the producers were like, "Look at this! We can use AI to make this!" We had never encountered this use of AI before, and the first concern raised for us was how this was going to be labeled. How will the audience know? The producers didn't have answers to those questions. And, honestly, ten months later it's still a situation where nobody knows. It's like the Wild West. The requirements that filmmakers should follow are not ethically or legally clear. And there is not much guidance. When we asked the producers these questions, they would say they were talking to lawyers and trying to figure it out but didn't really know. And this made us nervous, because these AI-generated images would obviously be mixed in with and up against actual historical archival. So that started us down this path.

JENNIFER PETRUCELLI: There are certainly good reasons to use AI. Films like Welcome to Chechnya (2020) and Another Body (2023) were very intentional in the use of AI to protect the identity of the subjects. But within our own little niche of the documentary world, we suddenly felt that there must be guidelines and standards around AI that we can agree to as a community. When you sit down to watch a documentary, I believe there’s a trust that the audience is putting in the filmmaker that what they’re seeing is real. And there’s an understanding when you sit down and watch a feature film, even one that’s based on a true story, that there are liberties that are taken. So, we really want to protect the integrity of documentaries.

D: I know it can be a complicated question, but how would you describe that "unwritten rule" of integrity in documentary, the idea that when searching for archival visual representation you are always striving for the most accurate material?

JP: I think that when you’re looking at archival material edited into a documentary, you are assuming it’s authentic. Certainly, people use re-creations, but those are usually set off with an obvious stylistic cue, so the audience knows—and it is important to know if what you are seeing is real or not. We could drill down into that question of whether there is “truth” and what that means. But I do think that most documentary audiences sit down trusting that what the filmmakers are showing them is truth through their lens—and that the visual material they are seeing is authentic.

RA: And as archival producers, we are often the gatekeepers for that authenticity, because we’re the ones who are digging into the archives and presenting the material to the director, to the editor, and saying, yes, this is this time period. Yes, this is this place. Yes, this is this specific moment in time. And that’s a big part of our job because we want to be as accurate and as specific to what something looked like, what something felt like, so that they can feel confident showing that to the audience and knowing that that is an accurate representation of what it is they’re trying to depict. Archival is a big piece of that.

D: If we look at the capabilities of these new generative AI tools like OpenAI’s Sora, for example, which will be able to create historical archival footage of any kind—what are the consequences of these developments on that documentary proof? 

RA: I think it’s going to completely shake people’s faith in anything they see. Perhaps that is the shakeup that we need, to take a step back from trusting any media, I don’t know. But in terms of documentary, it’s chilling because if we lose that implicit trust, then the form sort of loses its value altogether. 

JP: At the same time, there are reasons to use it. There are groups of people whose histories are not represented in the archives. Our concern is making sure that there’s transparency in the use so there’s a common understanding of what we are looking at and where it came from. Essentially, we just want to have some agreed-upon language around the use of AI, so people know what they’re seeing.

RA: This is something we have to think about now. While it is currently a trickle, it will become exponential. I can’t even imagine how many images are going to be AI-created just a year from now. I think it will feel like a deluge. And so, it will become increasingly difficult to discern—even within a documentary—what is AI and what is really archival. 

D: What are some of the other solutions, like guidelines, that you think should be implemented moving forward? I know it’s still a Wild West situation, but especially from the APA’s perspective, what are some of the first steps that need to happen?

RA: We’re speaking with people in England at PACT, which is their producers’ guild, and one of the things they are doing is creating a checklist of different AI companies and vetting them in several categories regarding ethical use of training data and other criteria. It is a start, because filmmakers by and large don’t know how and from what these images are being created. Another step might be doing something near the end of the documentary production process like a fair use review—but this would be an AI review, where someone signs off on every piece of synthetically generated material—how it is used, labeled, and if it comes from a verified company. Something along those lines that could be integrated into the documentary pipeline. 

JP: But there is still a question as to whose responsibility that is going to be. Is it going to fall to the archival producers? So, it’s really in the weeds on who is responsible for this spreadsheet, getting it to the lawyers, and making sure you’re covered from an insurance perspective.

RA: And we are going to need buy-in and partnership from the streamers and maybe the grant funders and others. Potentially awards organizations also. We’ve talked about the ways that they can influence things. Maybe you won’t be eligible for an Emmy or Oscar unless you meet these requirements, for example. But we are going to need more obvious buy-in. These ideas are all being batted around. One of the discussions at an upcoming meeting is how do we define a documentary? What is a documentary? That’s been a perennial question, and it is hard to assume that a documentary will look the same once the AI floodgates open.

D: If there are no guidelines and no buy-in, what do you see happening in terms of your own experience with crushed budgets and desperate producers wanting to create content? What is the worst-case scenario?

JP: Well, I certainly think there will be no more archival producers if we end up in the worst-case scenario. And that is part of our organization’s concern—people are worried about their jobs as well. Archival producers tend to be people who are very meticulous and very serious about the work and really bring a lot of passion to it. And I think that could go away. If the documentary film world just decides we’re not going to value archival anymore, it’s going to change the whole nature of what a documentary is. If that happens, I’m not sure what the point of a documentary is.

Williams Cole is a longtime producer, director, and editor who also researched issues in documentary as a Fulbright Scholar at the London School of Economics. He was a founder of The Brooklyn Rail, where he wrote extensively on documentary film. He has recently earned certificates in AI Strategy and Information Warfare.