The US Copyright Office has published a report recommending new and improved protections against digital replicas. “We have concluded that a new law is needed,” the report states. “The speed, precision, and scale of AI-created digital replicas calls for prompt federal action. Without a robust nationwide remedy, their unauthorized publication and distribution threaten substantial harm not only in the entertainment and political arenas, but also for private individuals.”
The Copyright Office’s assessment identifies several areas where current laws fall short of addressing digital replicas. It describes the state level as “a patchwork of protections, with the availability of a remedy dependent on where the affected individual lives or where the unauthorized use occurred.” Likewise, “existing federal laws are too narrowly drawn to fully address the harm from today’s sophisticated digital replicas.”
Among the report’s recommendations are safe harbor provisions to encourage online service providers to quickly remove unauthorized digital replicas. It also notes that “everyone has a legitimate interest in controlling the use of their likenesses, and harms such as blackmail, bullying, defamation, and use in pornography are not suffered only by celebrities,” meaning laws should cover all individuals and not just the famous ones.
The timing of this publication is fitting, considering that the Senate has been making notable moves this month to enact new legal structures around the use of digital replications and AI-generated copycats. Last week, legislators passed a bill to offer recourse for victims of sexual deepfakes. Today saw the introduction of a bill to more broadly allow any individual to sue for damages over unauthorized use of their voice or likeness.
Today’s analysis is the first of several parts of the Copyright Office’s investigation into AI. With plenty more questions to explore around the use of AI in art and communication, the agency’s ongoing findings should prove insightful. Hopefully legislators and courts alike will continue to take them seriously.