Hollywood guilds and studios are lining up against Seedance 2.0, a new ByteDance text-to-video model that insiders say makes it trivial to spin up short clips featuring famous faces and copyrighted characters. Industry groups argue the tool is already fueling large-scale infringement and trampling performers’ rights at a moment when synthetic media is accelerating across the internet.
ByteDance rolled out Seedance 2.0 to users of its Jianying editing app in China, with a global release planned through CapCut. The system generates 15-second videos from text prompts, placing it in the same class as frontier models like OpenAI’s Sora. Early demos circulating on social platforms show named celebrities and studio-owned characters appearing in plausible, stylized scenes—exactly the kind of output Hollywood has warned about for years.
Studios Decry Unchecked IP Use in AI Video Tools
The Motion Picture Association called on ByteDance to halt Seedance 2.0, saying the service enables mass, unauthorized use of protected works and undermines the legal framework that supports millions of creative jobs. The trade group’s message is blunt: without robust guardrails, text-to-video becomes a turnkey engine for derivative works that would normally require licenses.
Disney has sent a cease-and-desist letter after Seedance clips surfaced featuring Spider-Man, Darth Vader, and Grogu, according to reporting from Axios. The company framed the issue as an aggressive misappropriation of its characters—while signaling it’s not opposed to AI outright. Disney has reportedly pursued enforcement against other tech firms even as it struck a multi-year licensing deal with OpenAI, underscoring a broader point: sanctioned access is negotiable, but unlicensed replication is not.
One widely shared Seedance example depicted a stylized fight scene between lookalike versions of Tom Cruise and Brad Pitt. The clip prompted alarm from writers and filmmakers; Deadpool co-writer Rhett Reese publicly suggested that tools like this could destabilize the creative professions if left unregulated.
Actors and Unions Raise Likeness Alarms Over AI
SAG-AFTRA condemned Seedance 2.0 for enabling unauthorized digital doubles, echoing concerns that fueled last year’s contract fight over AI. The resulting agreements cemented consent and compensation requirements for digital replicas, but generative video systems that can conjure convincing performances from text push the boundary in ways that are hard to police at platform scale.
The Human Artistry Campaign, a coalition backed by major creative unions and trade bodies, cast Seedance 2.0 as a direct challenge to working artists. Their argument rests on two pillars: the right of publicity, which protects a person’s name, image, and likeness; and copyright, which covers original characters, scripts, and visual assets. Both are implicated when models can synthesize recognizable individuals or studio IP on demand.
How Seedance 2.0 Works and Why It Matters
Seedance 2.0 turns short prompts into coherent, 15-second motion clips with camera moves, lighting changes, and stylized art direction—capabilities popularized by high-end research models over the past year. The friction is low: the technology is embedded in mainstream editing apps already used by hundreds of millions of creators worldwide. If guardrails are lax, prompts referencing living actors or branded properties can yield outputs that look uncomfortably close to the real thing.
That reach is what scares studios. CapCut’s global footprint means any misstep is amplified, and takedown-only enforcement can’t keep pace with viral sharing. By contrast, some rivals publicly tout stricter filters around public figures, brand assets, and cinematic franchises, alongside watermarking and provenance initiatives aimed at curbing misuse.
Legal Fault Lines Around AI Video and Likeness Rights
Copyright and likeness questions are converging. U.S. copyright law prohibits unauthorized reproduction and the preparation of derivative works; the right of publicity in states like California and New York restricts commercial exploitation of a person’s identity. A model that outputs a scene with a recognizable actor or a studio-owned character can trigger both regimes, regardless of whether courts ultimately deem the underlying training to be fair use.
The U.S. Copyright Office is actively studying AI and has issued guidance requiring disclosure of AI-generated material in registrations, while multiple high-profile lawsuits against generative AI companies test how training on copyrighted datasets and producing lookalike content should be treated. Platform liability remains a live issue: DMCA safe harbors offer some protection if companies act on takedown notices, but willful design choices that encourage infringement can erode that shield.
What to Watch Next as Studios and Unions Push Back
Expect three immediate pressures on ByteDance: stricter model filters to block named people and protected characters, adoption of watermarking and content provenance standards, and licensing talks with major rights holders. Studios will likely escalate legal action if outputs mimicking marquee IP continue to spread.
For creators, the message is caution. Generating clips that evoke real actors or franchise icons can invite takedowns or legal claims, even if a tool makes it possible with a two-line prompt. For Hollywood, Seedance 2.0 isn’t just another demo—it is a test of whether the industry can secure enforceable norms before synthetic video floods the market faster than contracts and courts can respond.