Epic Games CEO Tim Sweeney is at odds with game store “Made with AI” labels, saying that flagging games for AI involvement is pointless because soon the technology will be integrated into almost every part of game creation. Store tags aren’t helpful to players, he said in a post on X — they’re an increasingly awkward fit for how AI would actually be used.
Sweeney argues that disclosure is appropriate at places like art exhibitions and licensing markets, where buyers need to know about rights and authorship, but not on storefronts marketing finished games.
With AI now infiltrating everything from art pipelines and QA tooling to dialogue prototyping and voice synthesis, he thinks the blanket label does more harm than good.
His remarks come in the midst of a larger culture war. Some creators and players argue that AI-heavy production devalues the human craft; others understand AI as a neutral tool — to them, indistinguishable from game engines or Photoshop — that doesn’t merit a warning sticker. Sweeney’s position is clearly in line with the latter camp: tell people about their rights and risks when it is relevant, but do not stigmatize the tool itself.
Sweeney’s Argument for Widespread AI in Games
The Epic chief has long been bullish on AI’s upside. Speaking to IGN earlier this year, he said advances in generative tools will put Zelda-sized ambitions within reach of small teams, bringing worlds usually reserved for AAA budgets within the grasp of 10-person studios. He also foresees entirely new genres emerging from AI-assisted systems.
Under that vision, AI stops being a feature and becomes infrastructure: accelerating asset creation, supporting developers, stress-testing balance, or crafting bespoke content. In a world like that, a storefront label tells you almost nothing about how a game was made or whether its content was responsibly sourced.
How Stores Are Using AI Today for Game Listings
Platforms are attempting transparency in a variety of ways. Steam, for instance, began asking developers in early 2024 to disclose how they use AI and who owns the resulting content. The forms aim to distinguish tools used during development from AI content in the final product, and to give players context when generative systems can produce or modify content at runtime.
The number of labeled games has risen rapidly. According to research from Totally Human Media, 7,818 titles on Steam disclosed the use of generative AI, roughly 7 percent of its library of 114,126 titles, up from 1 percent last year. That surge helps explain why a basic “AI” label risks becoming ubiquitous and not particularly useful.
Industry Divide and Consumer Confidence in Gaming
Not everyone is leaning in. Nintendo and Obsidian Entertainment have stated they’re steering clear of generative AI for now, reflecting concerns about quality, legal liability, and brand identity. Voice actors and artists, too, have pushed for consent and compensation guarantees; unions such as SAG-AFTRA have fought for protections around voice and likeness cloning in interactive media.
Disclosure advocates argue that knowing which games rely on AI helps players make values-based purchasing choices and may also warn them of potential IP or moderation issues. Critics maintain that top-level tags obscure critical nuances: Was AI used to draft barks that actors later rerecorded, or does the game generate its voices on the fly with models trained on unlicensed datasets? A tag that is too general can stigmatize legitimate uses while failing to flag real risks.
From Labels to Real Disclosure for Game Stores
Experts say a more effective approach would move away from labeling tools and focus on rights and risks disclosures. Examples include whether training data was licensed, whether shipped content uses synthetic voice or image generation, and how player data interacts with in-client models. That echoes Sweeney’s own point about art and licensing contexts: prove out the chain of rights, and alert users when AI changes something in a material way.
Clear definitions also matter. Stores might differentiate development-only uses (e.g., concept art ideation, code assistants) from runtime features such as in-game music generation or AI NPC dialogue, and demand more detailed metadata on the latter. That granularity would let players know what they’re getting into without slapping a one-size-fits-all label on entire games.
What It Means for the Epic Games Store and Players
Sweeney did not detail any changes to the Epic Games Store’s policies. But his remarks sharpen a critical question for platforms: do broad “AI” stamps help buyers, or should stores insist on specific disclosures related to rights, safety, and user experience? As AI adoption accelerates and regulatory scrutiny increases, expect disclosure standards to evolve away from a binary tag toward something more granular and arguably more informative.