OpenAI And SAG-AFTRA Strengthen Voice And Likeness Protections In Sora 2 With Support For NO FAKES Act


In Brief
OpenAI, in collaboration with SAG-AFTRA, Bryan Cranston, and talent agencies, has strengthened opt-in protections for voice and likeness in Sora 2 to ensure ethical AI use, while supporting the NO FAKES Act amid ongoing legal uncertainties.

Artificial intelligence research organization OpenAI issued a joint statement with SAG-AFTRA, the labor union representing approximately 160,000 media professionals, along with actor Bryan Cranston, United Talent Agency, Creative Artists Agency, and the Association of Talent Agents, addressing collaborative efforts to ensure protections for voice and likeness in Sora 2.
The statement noted that during the initial invite-only release of Sora 2 two weeks ago, some users were able to generate outputs featuring Bryan Cranston’s voice and likeness without his consent or compensation. While OpenAI’s policy has always required opt-in for the use of voice and likeness, the company acknowledged and expressed regret for these unintended occurrences. OpenAI has since implemented enhanced safeguards to prevent the replication of the voice and likeness of individuals who have not opted in.
“I was deeply concerned not just for myself, but for all performers whose work and identity can be misused in this way,” said Bryan Cranston, commenting on the situation. “I am grateful to OpenAI for its policy and for improving its guardrails, and hope that they and all of the companies involved in this work respect our personal and professional right to manage replication of our voice and likeness,” he added.
OpenAI maintains an opt-in policy for the use of an individual’s voice or likeness within Sora 2, allowing artists, performers, and other individuals to control if and how their identities may be simulated. This approach demonstrates the organization’s commitment to protecting creator rights, ensuring transparency, and promoting the responsible application of generative technology. The company has also pledged to address any complaints promptly.
NO FAKES Act To Protect Performers’ Voices And Likenesses
This new framework is consistent with the objectives of the NO FAKES Act, pending federal legislation aimed at safeguarding performers and the public from unauthorized digital replication. OpenAI, SAG-AFTRA, Bryan Cranston, and his representatives at United Talent Agency, the Association of Talent Agents, and Creative Artists Agency share a unified stance in support of the NO FAKES Act and its aim to create a national standard that prevents the use of performers’ voices and likenesses without permission. Collectively, they emphasize that consent and compensation are essential for fostering a sustainable and ethical creative ecosystem in both entertainment and technology.
“Bryan Cranston is one of countless performers whose voice and likeness are in danger of massive misappropriation by replication technology. Bryan did the right thing by communicating with his union and his professional representatives to have the matter addressed. This particular case has a positive resolution,” said SAG-AFTRA President Sean Astin. “I’m glad that OpenAI has committed to using an opt-in protocol, where all artists have the ability to choose whether they wish to participate in the exploitation of their voice and likeness using A.I. This policy must be durable, and I thank all of the stakeholders, including OpenAI, for working together to have the appropriate protections enshrined in law. Simply put, opt-in protocols are the only way to do business, and the NO FAKES Act will make us safer,” he added.
“OpenAI is deeply committed to protecting performers from the misappropriation of their voice and likeness,” said Sam Altman, CEO of OpenAI. “We were an early supporter of the NO FAKES Act when it was introduced last year, and will always stand behind the rights of performers,” he added.
OpenAI Navigates Concerns Over AI-Generated Likenesses Of Public Figures
OpenAI’s Sora 2, which allows users to generate AI-driven videos featuring the likenesses of public figures, has drawn significant attention since its launch. While some celebrities, such as Mark Cuban, have opted in to participate, the platform has faced criticism for enabling the creation of deepfakes of historical figures like Martin Luther King Jr. and living performers such as Bryan Cranston without consent from the individuals or their estates. This has prompted backlash from the families of those depicted and raised concerns about the ethical and legal implications of using AI to replicate individuals’ likenesses without permission.
In response, OpenAI has paused the generation of videos featuring Martin Luther King Jr. and is working with his estate to establish guidelines for the use of his likeness. The company has also reinforced the opt-in policy described above, reiterating that artists, performers, and other individuals control if and how their identities may be simulated in Sora 2.
Much of the scrutiny, however, stems from a legal landscape surrounding AI and digital likenesses that remains unclear and underdeveloped.
About The Author
Alisa, a dedicated journalist at the MPost, specializes in cryptocurrency, zero-knowledge proofs, investments, and the expansive realm of Web3. With a keen eye for emerging trends and technologies, she delivers comprehensive coverage to inform and engage readers in the ever-evolving landscape of digital finance.