Clair Obscur: Expedition 33 had two Indie Game Awards honors pulled after organizers discovered the developer used generative AI assets during development. IGA’s Nomination Committee retracted both the Game of the Year and Debut Game awards, reassigning them to the first runners-up: Blue Prince for Game of the Year and Sorry We’re Closed for Debut Game.
The reversal represents a rare post-award disqualification at a time when the industry is grappling with how — and whether — to use AI in commercial game art.

The reversal is particularly jarring because Clair Obscur has been acclaimed all year, winning marquee awards at major shows and earning “best of the year” mentions.
Why the Indie Game Awards Withdrew the Honors
The Indie Game Awards’ rules state that any title “developed using generative AI” is ineligible for submission, and a studio representative had previously assured organizers that no generative AI was used on Clair Obscur. But a producer told El País that some AI, albeit not much, was involved, and reports indicate the game shipped with AI-generated textures that were later patched out.
That timing is what matters: the IGA’s public FAQ states that replacing assets after launch does not restore eligibility. In other words, provenance counts; if generative AI played any role in shipped content, the game is disqualified from awards consideration, even if the offending assets are scrubbed after the fact.
A Flashpoint for AI in Game Art and Awards Policy
This episode reflects a larger reckoning in games. Studios big and small are experimenting with generative tools for concept art, textures, and marketing assets, while players and artists alike want clear boundaries and consent. The International Game Developers Association (IGDA) has urged studios to be explicit about AI disclosure and licensing, emphasizing transparency with their own teams as well as with consumers.
Platform policies are changing, too. Valve updated its Steam guidelines to require developers to disclose AI-generated content and confirm they hold the rights to the underlying training data. Similar disclosure requirements are springing up in game jams, festivals, and art contests, many of which have adopted “no gen-AI” clauses to safeguard human-made work and avoid copyright ambiguity.
Industry surveys, such as the annual State of the Game Industry report from GDC, reflect growing experimentation with generative tools and broad legal and ethical concerns among developers. The issue is not so much whether AI exists in the pipeline — turns out it does — but whether its use is transparent, appropriately licensed, and consistent with community norms.

Transparency Is Table Stakes for Awards Eligibility Now
If there is a lesson for studios, it is that disclosure can’t be an afterthought. Awards bodies and festivals are tightening eligibility rules, and juries are scrutinizing asset provenance more closely. Quietly “patching out” contested content won’t satisfy criteria designed to recognize human-created work.
The IGA’s willingness to reverse its own decision is notable, and for other organizations staging awards shows, it could mark a new posture: investigate, verify, make your rules mean something, and be prepared to reverse course when they are broken. That posture preserves trust in the selection process, but it also requires predictable rules: clear definitions of “generative AI,” explicit guidance on acceptable workflows, and procedures for auditing submissions without chilling real innovation.
Winners Disqualified: What Happens Next?
With the disqualification, Blue Prince picks up IGA’s Game of the Year, while Sorry We’re Closed collects Debut Game. Sandfall Interactive, the developer of Clair Obscur: Expedition 33, has not publicly detailed the extent of its AI use; comments attributed to the studio have circulated in press reports, leaving questions unanswered for fans and peers alike.
Clair Obscur’s wider awards run, which includes top honors at high-profile ceremonies this year, underscores how scattered AI policies are across institutions. With no industry consensus on common definitions and verification practices, developers can face drastically different outcomes at different venues for the same body of work.
The Broader Picture for Creators and Players
The debate isn’t merely philosophical. It involves employment, credit, and consumer trust. Artists want assurance their work won’t be undercut by unlicensed training sets; players want to know what they are buying; organizers want a level playing field. Clear labeling, documented pipelines, and third-party audits are likely to shift from “nice to have” to standard requirements.
For now, the IGA ruling sends one clear message: if generative AI touched the shipped assets, the game is ineligible. Studios vying for awards, and the reputational boost that accompanies them, will need to bake transparency into their pipelines from the start.