Legal FAQ: AI Tips for U.S. Documentary Filmmakers

By Dale Nelson and Victoria Rosales


It seems like the world wakes up to new information about artificial intelligence every day. From the lawsuits over the use of unlicensed works to train AI models to the controversies surrounding the digital resurrection of late actors in Hollywood’s biggest franchises, it’s easy to see why AI can feel daunting to filmmakers. And that’s before you consider that these legal and ethical conundrums have not yet been substantively or consistently addressed by our courts or legislatures. To help filmmakers navigate the tricky waters of AI, our firm has compiled a few items of practical advice for documentary filmmakers who want to use AI in a cautious yet effective manner. And although these tips are based on U.S. law, we’re hopeful that their practical nature will benefit filmmakers in other jurisdictions as well.

TIP #1: If you’re using generative AI, look for models trained on licensed materials.

By now, you’ve probably heard about the many lawsuits against image- and video-generation AI companies such as OpenAI, Midjourney, and DeviantArt. In short, some copyright holders allege that these companies are liable for copyright infringement because they used the rightsholders’ protected works, without permission or payment, to train their AI models. In response, most of the companies argue that using copyrighted materials to train AI models is a transformative use of the works and thus protected under the fair use doctrine. Whether the courts will agree remains to be seen. Although a Delaware district court judge recently determined that the use of copyrighted material to train the AI model at the center of Thomson Reuters v. ROSS Intelligence was not fair use, that model was not generative, and the case is currently on appeal.

Until the courts sort out this complicated issue, what are filmmakers’ options if they want to use AI-generated material? Rather than relying on AI models trained on unlicensed material, our firm generally advises clients to use models trained on licensed materials. For example, companies like Adobe and Getty offer generative AI models that they claim are trained only on licensed material or material not protected by copyright. This mitigates the risk that a copyright holder will come forward and claim that the AI-generated material in your film infringes their copyright. As an added layer of protection, many of these companies offer indemnification for their users, meaning the company will cover the legal costs of a user who faces a copyright infringement claim based on the generated material.

TIP #2: If you use AI in your film, the AI elements won’t be covered by your copyright. 

If you’ve created a work such as a song or a piece of artwork using generative AI and incorporated it into your film, can you claim copyright in that element? 

For the answer, we are guided by 2023 guidance issued by the Copyright Office, titled “Copyright Registration Guidance: Works Containing Material Generated by Artificial Intelligence,” and a handful of Copyright Office decisions that have addressed this issue. According to the Office, the Copyright Act requires some “human authorship” for a work to be copyrightable. In the Office’s view, that requirement is absent from wholly AI-generated works but may be present in works that merely contain AI-generated material. In other words, the AI-generated material itself is not copyrightable, but a work that incorporates it may still have copyrightable aspects.

As an example of this dichotomy, consider the graphic novel Zarya of the Dawn. Artwork within the book was created using a generative AI model, but this was not disclosed to the Office when the application for registration was filed. The Copyright Office approved the book’s registration in 2022 but subsequently initiated an investigation after learning that artificial intelligence may have been involved in the book’s creation. Following the investigation, the Office concluded that the creator’s input of text prompts into Midjourney was not enough to establish “human authorship” over the Midjourney-generated images included in the book. However, the Office did grant copyright protection to the author’s wholly original text within the book and to her creative selection, coordination, and arrangement of the text and the Midjourney-generated images.

Since then, the United States District Court for the District of Columbia has affirmed the Copyright Office’s analysis. In Thaler v. Perlmutter, a plaintiff filed a lawsuit after his application to register a piece of visual art generated by his AI model, “Creativity Machine,” was denied by the Copyright Office. The District Court agreed with the Copyright Office’s decision and held that because the art was not the product of human authorship, it was not protected by copyright. That decision was upheld on appeal.

Taking guidance from these decisions and the Copyright Office’s policy, our advice to filmmakers is to disclose any material AI-generated elements contained in your film when filing an application to register copyright in the film, in the same way that preexisting materials such as photographs, video footage, and music are identified and expressly excluded from copyright protection in an application. As in the Zarya of the Dawn example above, this likely won’t affect the film’s overall copyright registration, and filmmakers will still be able to reap the benefits of registration (such as the ability to file infringement lawsuits and seek statutory damages) for the film as a whole. However, filmmakers should take care to inform distributors and exhibitors of the film about the inclusion of AI-generated material and its exclusion from copyright protection. Filmmakers should also be aware that, because the AI-generated material won’t be protected by copyright, it will be difficult to police any piracy of that specific material, online or otherwise.

TIP #3: If you want to create “digital replicas” for your film, check out the relevant state laws first.

The re-creation of actual events and people through reenactments has been a long-standing fixture in documentary filmmaking. As AI technology advances, the possibility of using artificially generated “digital replicas” in lieu of actors may be increasingly appealing to some filmmakers. Setting aside any ethical issues, is it legal? The answer is most likely yes, but the emerging right-of-publicity laws should be consulted to be sure your digital replica use is covered by an exemption or otherwise falls outside the relevant statute’s prohibitions. If your film is a SAG-AFTRA production, guild rules will also apply.

It is nearly certain that digital replica and voice simulation laws will soon be widespread. To date, however, only New York, Louisiana, Tennessee, and California have enacted laws that address the use of this technology, and each has nuances that affect its applicability to certain individuals. For example, California’s law applies only to deceased individuals, for 70 years following their death. In contrast, Tennessee’s law applies during an individual’s lifetime and for 10 years following their death. New York’s and Louisiana’s laws apply only to professional performers, while California’s and Tennessee’s apply to all individuals. In all cases, the governing statute will likely be the law of the state where the individual being replicated resides or, for deceased persons, where the individual resided at death.

Therefore, when considering a digital replica that might be affected by one of these state laws, ask yourself questions such as: Does the relevant law apply to living persons, deceased persons, or both? If the individual to be replicated is deceased, has the term of protection expired? Is or was the person you want to replicate a professional performer? Depending on the answers, your subject might not fall under a digital replica prohibition at all.

Moreover, all of the statutes provide for First Amendment–protected uses of digital replicas. For example, the California statutes have express exemptions that allow for the unauthorized use of digital replicas in a number of instances, including in connection with matters of public interest such as news, public affairs, or sports broadcasts. California’s exemptions also cover uses for purposes of comment, criticism, scholarship, satire, or parody, and uses that are fleeting or incidental. Perhaps most notably for purposes of this publication, California permits the use of digital replicas as a representation of the individual as themself in a documentary or in a historical or biographical manner, including some degree of fictionalization, unless the use is intended to create, and does create, the false impression that the work is an authentic recording in which the individual participated.

Although there is currently no federal right of publicity, Congress seems keen to enact a federal law addressing digital replicas. Both the proposed “No AI FRAUD Act” and “NO FAKES Act” would create a property right in an individual’s voice and likeness and restrict the use of digital replicas without the permission of the depicted individual. Any such law would likely include free speech protections as well.

Between state and federal legislation, new guild regulations, and external pressure from public criticism, the AI and entertainment space is in a constant state of flux. However, we are hopeful that these simple tips will provide documentary filmmakers with guidelines that allow them to incorporate this new technology to tell important stories while simultaneously reducing potential risk.


Dale Nelson is a partner at Donaldson Callif Perez LLP, where she represents independent filmmakers, documentarians, podcasters, and artists. Her practice is particularly focused on laws relating to copyright, trademark, rights of publicity, moral rights, free speech, and fair use.

Victoria Rosales is an associate at Donaldson Callif Perez LLP, handling a variety of legal issues for independent and documentary productions. As a lifelong lover of the arts, Victoria dedicates her practice to helping creators shape and execute their vision.