Bryan Cranston, CAA, UTA Praise OpenAI's New "Guardrails" For Sora 2

🕒︎ 2025-10-20


Bryan Cranston seems to have been won over by OpenAI after his voice and likeness were inadvertently used on Sora 2, the new iteration of its generative AI video platform, without his consent. Though he was initially troubled to find his image being used on Sora 2, the implementation of new guardrails around consent seems to have assuaged his concerns.

“I was deeply concerned not just for myself, but for all performers whose work and identity can be misused in this way,” the Breaking Bad actor said in a statement via SAG-AFTRA on Monday. “I am grateful to OpenAI for its policy and for improving its guardrails, and hope that they and all of the companies involved in this work, respect our personal and professional right to manage replication of our voice and likeness.”

The actors’ union said that Cranston’s voice and likeness were able to be generated in “some outputs” during the initial, invite-only launch phase of Sora 2 several weeks ago, adding that the actor himself brought the issue to the union’s attention.

“While from the start it was OpenAI’s policy to require opt-in for the use of voice and likeness, OpenAI expressed regret for these unintentional generations. OpenAI has strengthened guardrails around replication of voice and likeness when individuals do not opt-in,” SAG-AFTRA said in a joint statement issued Monday along with OpenAI, the Association of Talent Agents, United Talent Agency and Creative Artists Agency.

The cooperation of two of the major talent agencies, CAA and UTA, is noteworthy, considering they were among the first to raise alarms about Sora 2 and the potential risks it posed to their clients. Those agencies are now touting the “productive collaboration” with OpenAI and SAG-AFTRA to protect artists’ “right to determine how and whether they can be simulated.”

The fight isn’t over yet, though. In a statement of his own Monday, newly elected SAG-AFTRA President Sean Astin warned that “Bryan Cranston is one of countless performers whose voice and likeness are in danger of massive misappropriation by replication technology.”

“Bryan did the right thing by communicating with his union and his professional representatives to have the matter addressed. This particular case has a positive resolution. I’m glad that OpenAI has committed to using an opt-in protocol, where all artists have the ability to choose whether they wish to participate in the exploitation of their voice and likeness using A.I.,” Astin’s statement continued. “This policy must be durable and I thank all of the stakeholders, including OpenAI for working together to have the appropriate protections enshrined in law. Simply put, opt-in protocols are the only way to do business and the NO FAKES Act will make us safer.”

The NO FAKES Act, currently circulating in Congress, seeks to ban the production and distribution of unauthorized AI-generated replicas of an individual’s likeness or voice, requiring the individual’s express consent for such replicas. Right now, AI companies rely on the “fair use” doctrine to protect them, and the legal framework around AI is not yet firmly established, copyright experts say.

OpenAI has publicly supported the bill and continued to do so Monday. “OpenAI is deeply committed to protecting performers from the misappropriation of their voice and likeness. We were an early supporter of the NO FAKES Act when it was introduced last year, and will always stand behind the rights of performers,” OpenAI CEO Sam Altman said in a statement.
Cranston was not alone in his objections to Sora 2. Last week, the estate of Martin Luther King Jr. and OpenAI agreed to pause images of King created by the platform. With just a brief text prompt, users have been able to show King or a wide range of others (Fred Rogers, Tupac Shakur, Kobe Bryant) in made-up settings – the wackier the better for many users. King appeared in one video shilling for Burger King. So far, many of the famous faces whose likenesses have been available on the platform belong to deceased figures, though not all, as demonstrated by Cranston’s appearance in some videos.
