By: Alan Clarke, Esq., Peter Stathopoulos, Esq., and Brittany Waserstein-Riemer, Esq.
A Georgia bill that would create restrictions on the unauthorized commercial exploitation of an individual’s voice or likeness through the use of AI (introduced as HB 566 and entitled the “NO FAKES Act of 2025”, or the “Bill”) failed to pass the Georgia House by crossover day (March 6, 2025). Accordingly, the Bill will not advance further in the 2025 legislative session. However, the Bill will still be eligible for advancement in the 2026 legislative session.
Pursuant to the language of the Bill, it is intended (1) to protect intellectual property rights in the voice and visual likeness of individuals; (2) to create a right of use for the voice or likeness of an individual in a digital replica; (3) to provide for the licensing and transferability of such right; and (4) to create civil penalties for the unauthorized use of such a digital replica, and for other related purposes.
The introduction of the Bill in Georgia followed the passage of similar legislation in Tennessee in 2024 (the Ensuring Likeness Voice and Image Security Act, or "ELVIS Act," which expanded that state's existing right of publicity law to prohibit the use of AI to replicate a person's voice or likeness without consent) and the introduction of similar federal legislation (the "NO FAKES Act of 2024," introduced as S. 4875). Similar legislation has also been introduced or enacted in other states, including Illinois (which amended its Right of Publicity Statute to address AI), California (AB 2602 and AB 1836, which would protect against the misuse of digital replicas), and Kentucky (SB 317, which, much like the ELVIS Act, would regulate the unauthorized commercial use of an individual's likeness through AI-generated content). Still other states (e.g., Idaho, Indiana, New Mexico, Michigan, Texas, Minnesota, Washington State, Oregon, Utah, and Wisconsin) have adopted legislation regulating AI in political advertisements.
The Bill provides comprehensive legal protections, including a new licensable and transferable property right in an individual's voice and likeness. The Bill is intended to address concerns regarding the use of Artificial Intelligence ("AI") in creating, distributing, or using digital replicas of a person's likeness or voice for commercial purposes without prior consent. However, the proposed law aims to balance First Amendment rights by providing an exception for fair use, including journalistic reporting, parody, and other protected speech. In this respect it resembles the federal NO FAKES Act and the Digital Millennium Copyright Act, under which online platforms can avoid liability if they promptly remove unauthorized digital replicas upon receiving a takedown notice (the federal NO FAKES Act would establish this safe harbor; the Digital Millennium Copyright Act already provides one). Any individual, or their estate, would have standing to sue for damages if their voice or likeness is misappropriated. However, the federal NO FAKES Act and the Georgia NO FAKES Act differ in that the Georgia NO FAKES Act explicitly addresses how AI-generated content may be used commercially.
With the rise of technological advances such as AI, federal copyright law, NIL laws (i.e., Name, Image, and Likeness), and entertainment industry contracts (including SAG-AFTRA and WGA collective bargaining agreements) have become increasingly interconnected. Traditionally, the right of publicity allows individuals to control and profit from the commercial use of their name, image, and likeness (i.e., NIL rights). The Georgia NO FAKES Act extends the foregoing concept to live performances, audiovisual works, and digital replicas, ensuring that unauthorized AI-generated replicas are encompassed within these protections, unlike the federal NO FAKES Act, which provides protections only in audiovisual works or sound recordings. NIL rights have historically been governed by state law, and the resulting protections vary across jurisdictions, with some jurisdictions providing no protection at all.
Federal copyright law, on the other hand, primarily protects original works of authorship, such as films, music, and writings. It does not adequately protect an individual's voice or likeness unless, perhaps, that voice or likeness is embedded in a copyrightable work. This fragmented and complex legal landscape creates challenges for individuals and businesses operating across multiple states. SAG-AFTRA, an American labor union that represents approximately 160,000 actors and performers, has been a strong advocate for the federal NO FAKES Act, which is similar to the Georgia NO FAKES Act, pushing for protections to regulate AI and to protect artists from exploitation of their image and likeness. These efforts have led SAG-AFTRA to enter into agreements with AI technology companies and to include key provisions in SAG-AFTRA contracts addressing consent and compensation for digital replicas, restrictions on AI voice and likeness use, protections against unauthorized AI replication, and safeguards against job replacement.
Notably, the Georgia NO FAKES Act would modernize Georgia's legal framework to address the unique challenges posed by AI and similar technological advances. It would implement robust protections for individuals against the unauthorized use of their voice and likeness, preserving individual autonomy and promoting innovation while ensuring that an individual's privacy and consent are respected, particularly as technology continues to rapidly evolve.
The Bill has been the subject of criticism. On a broad level, this includes concerns about creating a patchwork of differing state laws that may conflict with federal law and concerns with unduly limiting free expression. Another concern is that, like the ELVIS Act, the Bill does not appear to limit claimants to those whose rights have been injured in that particular state and could lead to litigation from claimants all over the country. Groups such as the Computer and Communications Industry Association argue that liability should be limited to those who knowingly violate an individual’s intellectual property rights and should not extend to service providers. See the February 28, 2025, letter of Tom Mann, State Policy Manager, South, Computer & Communications Industry Association in opposition to HB 566.
Further, there is concern over the damages set forth in the Bill. The Bill would enable recovery of the greater of actual damages plus profits or statutory damages ($5,000 per "work embodying the applicable unauthorized digital replica" for an individual or "online service," and $25,000 per work for "an entity that is not an online service"), along with the opportunity to seek injunctive or other equitable relief, punitive damages for willful violations, and reasonable attorney's fees. Opponents also note that the Bill provides neither a counter-notice mechanism nor a right of appeal for a person whose content is subject to a takedown notice.
As is often the case, the relationship between state and federal law is a complex one, and if both the federal NO FAKES Act and the Bill are enacted, the Bill may face preemption issues. The preemption doctrine, which derives from the Supremacy Clause in Article VI of the U.S. Constitution, generally provides that when a federal law conflicts with a state law, the federal law prevails. This means that the Georgia NO FAKES Act would likely be unenforceable in areas where it conflicts with federal law. However, because Georgia's NO FAKES Act appears to be more expansive, it might survive through partial preemption (i.e., only the portions that directly conflict with federal provisions would be overridden). Proponents of the Georgia NO FAKES Act are presumably hoping that Congress intends the federal NO FAKES Act to serve as a floor of minimum protection against unauthorized AI-generated content, preserving states' ability to offer individuals additional protections.