Tilly Norwood is your typical 21st-century up-and-coming London actress: red-carpet shots, behind-the-scenes videos, and something like 40,000 Instagram followers. But the person audiences are meeting does not, in fact, exist. Norwood is an AI-generated creation of Xicoia, the AI arm of production company Particle6, and was presented at the Zurich Film Festival by producer Eline Van der Velden. The experiment has jolted Hollywood's system of labor and talent, with guilds, actors, and agents all signaling that a line is being crossed.
Van der Velden has framed Norwood as a creative project, and has even attempted to find the character an agent.

The pitch is a simple one: a digital actor who, like any other, can be booked for a price. The response has been anything but straightforward, exposing fault lines over authorship, consent, and the economics of star power at a time when generative AI is coming into its own.
Why a Synthetic Actress Is Raising Eyebrows
The objections aren’t just philosophical. SAG-AFTRA has cautioned that “synthetic performers” raise fundamental questions about contractual protections (notice, bargaining, and pay) when a digital entity is employed rather than a human performer. The union’s position is anchored in the hard-fought AI provisions won in its 2023 contract, which mandate consent and payment for digital replicas and aim to stop models trained on unlicensed performances from replacing members.
Actors also see a slow erosion of the human connection that fuels the business. On a Variety podcast, Emily Blunt summed up the unease succinctly, calling the notion “scary” and begging agencies not to lend their stamp of approval. What concerns them is not one character’s Instagram feed but the precedent: if agencies normalize synthetic clients, human talent loses negotiating leverage, and training data harvested from real performances becomes a competitive weapon against the very people who produced it.
Writers and directors share similar worries. The Writers Guild of America has emphasized that AI is prohibited from writing or rewriting covered material, and the Directors Guild has stressed that AI cannot supplant the duties of its members. The through line is obvious: creative labor wants guardrails before studios and vendors roll out AI at scale for casting, performance, and promotion.
Inside the Norwood Pitch and Its Festival Debut
Norwood’s supporters are attempting to portray her as art, not a worker substitute. Van der Velden has responded publicly, calling the character a conversation starter, an authored and curated persona. That defense mirrors how virtual influencers have been marketed in fashion and advertising for years, from Lil Miquela’s brand collaborations to Spain’s Aitana Lopez, a synthetic model said to earn thousands of dollars a month through partnerships.

But the movie theater is not a billboard. An actor’s worth isn’t just the photo-perfect feed we see: it’s the improvisation on set, the chemistry with scene partners, and the lived experience that informs performance choices. That’s why casting directors and producers have long paid a premium for bankable stars. It’s also why insurers and completion bond companies scrutinize the risks of new productions so carefully; swapping a human lead for the face of a lab-made star creates a legal and reputational knot that is not easily undone.
Contracts, Law, and the Reality of Business
Even if synthetic actors do score parts, their contracts won’t be cut and dried. Studios would have to negotiate with unions whenever a digital stand-in is used in place of a human, and they would need records of where the model’s training data originated. Without ironclad provenance and consent, productions risk running afoul of right-of-publicity laws, which protect name, image, and likeness in key markets like California. State-level activity is heating up as well: Tennessee’s ELVIS Act updated existing law to guard against unauthorized cloning of voice and likeness, and similar bills are moving through legislatures in other states.
There’s also the market test. Advertisers have proven interested in virtual influencers; HypeAuditor has cataloged hundreds and recorded competitive engagement rates. But long-form narrative is rougher going. Audiences accept digital doubles for stunts and de-aging so long as a human performer anchors the role. They have yet to embrace feature-length stories driven by characters with no human core beneath the pixels.
Technology is racing ahead. OpenAI’s latest text-to-video advances, combined with fast gains by research labs and startups, bring photoreal performance synthesis closer every quarter. McKinsey has estimated that generative AI could add trillions of dollars in annual economic value worldwide, and entertainment will capture some of that through localization, VFX workflows, and marketing automation. But none of that means fans will shell out to see an AI headliner.
What Hollywood Does Next With Synthetic Performers
Unions are already drafting policies about synthetic clients, guilds are sharpening enforcement language, and studios are exploring hybrid uses that keep human performances at the core — think digital body doubles under actor control or AI tools for ADR, dubbing, and background activity with consent and compensation. In that world, AI is an enabling layer, not a replacement.
Tilly Norwood might endure as an Instagram-age curiosity, a test case that helps define where the lines fall. But the larger takeaway from Hollywood’s early reaction is clear: until provenance, permission, and pay are fully worked out, and until audiences themselves demand AI-led stories, the industry has little appetite for a synthetic star leaping from feed to feature.
