
Sora Bans Public Figure Deepfakes But Not Dead Stars

By Bill Thompson
Last updated: October 28, 2025 2:13 pm
Technology · 7 Min Read

OpenAI’s Sora is drawing a hard line on deepfakes of living members of the public, but still leaving the door open for depictions of deceased celebrities.

The invite-only app is already inundated with slick AI-created videos featuring Michael Jackson, Tupac Shakur, Bob Ross, Malcolm X, and Bruce Lee, among others, as well as a host of famous characters. Every Sora video is watermarked, and the company says that living public figures can be included by design—only with direct permission as part of a Cameo-like feature.

Table of Contents
  • What the ban covers for living public figures
  • Why depictions of dead celebrities remain an exception
  • Misinformation Risks Despite Watermarks
  • Copyright and trademark headwinds for AI-generated clips
  • How Sora can reinforce guardrails for safer creation
  • Sora’s bottom line on deepfakes, consent, and risk
[Image: The Sora logo, a black abstract geometric interlocking knot design, above the word Sora in black text, set against a light blue and white hexagonal background]

The carve-out places Sora at the center of a thorny debate: where consent, postmortem rights, and platform responsibility meet the viral attraction of hyperreal generative video.

What the ban covers for living public figures

OpenAI frames its ban on unauthorized deepfakes of living public figures as a safety measure, paired with a Cameo-style consent mechanism through which people can opt in. In practical terms, that means you can’t generate a current politician or pop star without their sign-off, but dead people and made-up IP are fair game.

Users have published clips that don’t just resemble a person’s face but also reproduce voices and mannerisms with uncanny plausibility. Some videos feature familiar songs, suggesting a second minefield around music licensing. The result is a feed that plays less like obvious parody and more like an anthropological time capsule, one generated by an AI model in seconds.

Why depictions of dead celebrities remain an exception

Even though Sora permits depictions of deceased figures, the legal terrain is anything but simple. Many jurisdictions allow rights of publicity to survive death: California’s Celebrity Rights Act and New York’s postmortem protections both give estates the power to police commercial uses of a dead person’s name, image, and likeness. That leaves platforms and creators on the hook if a clip functions as advertising, incorporates protected trademarks, or implies an endorsement.

Estates have a track record of aggressive enforcement. The families behind legends like Michael Jackson and Marilyn Monroe have fought repeatedly over rights to likenesses and merchandise. If OpenAI charges for Sora, or if its clips are repurposed in brand partnerships or commercialized the way user-made content is inside Fortnite, expect legal challenges.

Misinformation Risks Despite Watermarks

OpenAI says every Sora video now carries a watermark, a baseline defense that is better than nothing but easy to blunt by cropping, re-encoding, or simply re-uploading. Researchers and digital-rights advocates, including the Electronic Frontier Foundation, have long cautioned that provenance labels alone are no silver bullet without persistent metadata and cross-platform enforcement.

[Image: A family of woolly mammoths, including a large adult and a smaller juvenile, walking through a snowy landscape with pine trees and mountains in the background]

Public anxiety is already elevated. The Reuters Institute has found that most people worry about distinguishing real material from AI-generated content, a perceptual hurdle that grows only more daunting as hyperreal videos gain traction without warning. And the danger is more than theoretical: audio deepfakes have impersonated political leaders and duped call recipients, prompting regulators including the FCC and consumer-protection agencies to crack down on AI-powered robocalls. Video that looks genuine at first glance compounds the same problem.

Copyright and trademark headwinds for AI-generated clips

Allowing users to generate Nintendo’s Mario or Nickelodeon’s SpongeBob, for instance, opens the door to classic copyright and trademark liability. Rights holders with aggressive IP enforcement arms are unlikely to treat mass distribution of unlicensed character likenesses, human or otherwise, as permissible, even when clips are labeled “AI-generated.” Watch for takedowns and, potentially, test cases over where transformative fair use ends and infringement begins.

Music rights are a second layer. Recent industry moves, such as lawsuits by the Recording Industry Association of America against AI music startups, illustrate how rapidly labels act when models duplicate signature sounds or rely on catalog recordings without consent. If Sora clips include clear vocals or unlicensed tracks, there’s potential for claims from a number of angles against both creators and platforms.

How Sora can reinforce guardrails for safer creation

Three levers matter most: consent, provenance, and enforcement. Consent should be granular and auditable, especially where public figures and estates are involved. Provenance needs to go beyond visible watermarks: it should include tamper-resistant metadata aligned with the C2PA standard backed by Adobe, Microsoft, the BBC, and others, plus visible labels that persist across reposts.
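C2PA itself defines cryptographically signed manifests bound to the media file. As a minimal illustration of the tamper-evident idea only (this is not the actual C2PA format, and the manifest fields and key are invented for the example), a provenance record can be signed so that any edit to its contents invalidates the signature:

```python
import hashlib
import hmac
import json

SIGNING_KEY = b"demo-key-not-for-production"  # illustrative only


def sign_manifest(manifest: dict) -> dict:
    """Attach an HMAC computed over the canonical JSON encoding."""
    payload = json.dumps(manifest, sort_keys=True).encode()
    tag = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return {**manifest, "signature": tag}


def verify_manifest(signed: dict) -> bool:
    """Recompute the HMAC and compare in constant time."""
    claimed = signed.get("signature", "")
    manifest = {k: v for k, v in signed.items() if k != "signature"}
    payload = json.dumps(manifest, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(claimed, expected)


# A hypothetical provenance record for a generated clip.
record = sign_manifest({
    "generator": "example-video-model",
    "created": "2025-10-28T14:13:00Z",
    "content_sha256": hashlib.sha256(b"<video bytes>").hexdigest(),
})

assert verify_manifest(record)                       # untouched record verifies
assert not verify_manifest({**record, "generator": "edited"})  # any edit breaks it
```

Real provenance systems use public-key signatures rather than a shared secret, so anyone can verify a manifest without being able to forge one; the sketch only shows why tampering with signed metadata is detectable.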

Enforcement needs to work on both the front and back ends: timely estate verification, IP takedowns that stick across mirrors, and at least a little friction or rate limiting when prompts drift toward impersonation. Platforms like YouTube already pair synthetic-media disclosures with impersonation removals; Sora will be measured against the best of those practices, at minimum.

Sora’s bottom line on deepfakes, consent, and risk

OpenAI’s split-the-difference policy (ban deepfakes of living public figures, permit replicas of dead celebrities) values consent without completely abandoning cultural remix. But the exemption invites court challenges from estates and rights holders, and it raises the risk that casual viewers will mistake slick AI fiction for fact. If Sora aims to be the storefront for generative video, it needs consent workflows estates trust, provenance that travels with the clip, and clear rules creators can follow without hiring a lawyer.

Bill Thompson
Bill Thompson is a veteran technology columnist and digital culture analyst with decades of experience reporting on the intersection of media, society, and the internet. His commentary has been featured across major publications and global broadcasters. Known for exploring the social impact of digital transformation, Bill writes with a focus on ethics, innovation, and the future of information.