FindArticles © 2025. All Rights Reserved.

Indie Awards Pull Clair Obscur’s GOTY Over AI Use

Last updated: December 22, 2025 6:02 pm
By Richard Lawson
Entertainment

Clair Obscur: Expedition 33 had two Indie Game Awards honors pulled after organizers discovered the developer used generative AI assets during development. IGA’s Nomination Committee retracted both the Game of the Year and Debut Game awards, reassigning them to the first runners-up: Blue Prince for Game of the Year and Sorry We’re Closed for Debut Game.

The reversal represents a rare post-award disqualification at a time when the industry is grappling with how — and whether — to use AI in commercial game art.

Table of Contents
  • Why the Award Was Withdrawn by Indie Game Awards
  • A Flashpoint for AI in Game Art and Awards Policy
  • Transparency Is Table Stakes for Awards Eligibility Now
  • Winners Disqualified, Then What Happens Next?
  • The Broader Picture for Creators and Players

That’s particularly jarring because Clair Obscur has been acclaimed all year, winning marquee awards at major shows and earning “best of the year” mentions.

Why the Award Was Withdrawn by Indie Game Awards

Under the Indie Game Awards’ rules, any title “developed using generative AI” is ineligible for submission. (A studio representative had previously assured organizers that no generative AI was used on Clair Obscur.) But a producer said in an interview with El País that some AI, albeit not much, was involved, and reports have suggested the game shipped with AI-created textures that were later patched out.

The IGA’s public FAQ makes clear that replacing assets after launch does not restore eligibility. In other words: provenance counts. If generative AI played any role in shipped content, the game is disqualified from awards consideration, even if the offending assets are scrubbed after the fact.

A Flashpoint for AI in Game Art and Awards Policy

This episode reflects a larger reckoning in games. Studios big and small are experimenting with generative tools for concepting, textures, and marketing art, while players and artists alike want clear boundaries and consent. The International Game Developers Association (IGDA) has called on studios to disclose AI use and licensing terms, emphasizing transparency with development teams as well as consumers.

Platform policies are changing, too. Valve updated its Steam guidelines to require developers to disclose AI-generated content and confirm they hold the rights to the underlying training data. Similar disclosure requirements are springing up in game jams, festivals, and art contests, many of which have adopted “no gen-AI” clauses to safeguard human-made work and avoid copyright murkiness.

Industry surveys, such as the annual State of the Game Industry report from GDC, reflect growing experimentation with generative tools and broad legal and ethical concerns among developers. The issue is not so much whether AI exists in the pipeline — turns out it does — but whether its use is transparent, appropriately licensed, and consistent with community norms.

The Clair Obscur Expedition 33 video game title card, featuring a group of adventurers in a dark, fantastical landscape.

Transparency Is Table Stakes for Awards Eligibility Now

If there is a lesson for studios, it is that disclosure can’t be an afterthought. Awards bodies and festivals are tightening eligibility rules, and juries are scrutinizing asset provenance more sharply. Quietly “patching out” contested content will not satisfy criteria designed to recognize human-created art.

For organizations staging awards shows, the IGA’s action could mark a new posture: investigate, verify, make your rules mean something, and be prepared to reverse course when they are broken. That posture preserves trust in the selection process, but it also demands predictable rules: clear definitions of “generative AI,” explicit guidance on acceptable workflows, and procedures for auditing submissions without chilling real innovation.

Winners Disqualified, Then What Happens Next?

With the disqualification, Blue Prince picks up IGA’s Game of the Year, while Sorry We’re Closed collects Debut Game. Sandfall Interactive, the developer of Clair Obscur: Expedition 33, has not publicly detailed the extent of its AI use; prior comments attributed to the studio have circulated in press reports, leaving questions unanswered for fans and peers alike.

Clair Obscur’s wider awards run, which includes top honors at high-profile ceremonies this year, shows just how scattered AI policies are across institutions. Without industry consensus on common definitions and verification practices, developers can face drastically different outcomes at different venues for the same body of work.

The Broader Picture for Creators and Players

The debate isn’t merely philosophical. It involves employment, credit, and consumers’ trust. Artists want assurance that their work won’t be undercut by unlicensed training sets; players want to know what they are buying; organizers want a level playing field. Clear labeling, documented pipelines, and third-party audits are likely to shift from “nice to have” to standard requirements.

For now, the IGA ruling sends one clear message: if generative AI touched shipped assets at any point, the game is ineligible. Studios vying for awards, and the reputational boost that accompanies them, will need to bake transparency into their pipelines from the start.

By Richard Lawson
Richard Lawson is a culture critic and essayist known for his writing on film, media, and contemporary society. Over the past decade, his work has explored the evolving dynamics of Hollywood, celebrity, and pop culture through sharp commentary and in-depth reviews. Richard’s writing combines personal insight with a broad cultural lens, and he continues to cover the entertainment landscape with a focus on film, identity, and narrative storytelling. He lives and writes in New York.