The internet’s equilibrium has been upended, according to a new industry analysis: as of 2025, more than half of newly published articles may be at least partly created by automated software. The finding, from the SEO consultancy Graphite, underscores how swiftly automated writing has gone from experiment to default for many publishers, marketers and affiliate sites.
According to researchers at Graphite, AI-written articles recently overtook human-written ones by volume, peaking at an estimated share of more than 55 percent of new articles. The share no longer appears to be growing, stabilizing rather than climbing, but that new baseline indicates a structural change in how information is created on the internet.
What the New Analysis Found About AI-Written Articles
AI-written articles have exploded since large language models became widely available, rising from a sizeable minority to a solid majority of new posts, according to Graphite. The company bases its estimate on pattern analysis and classifier-driven signals across a large swath of newly published pages.
The researchers make a critical observation: pages written by AI aren’t performing consistently in search. They are less likely to appear near the top of search results or in answers from major AI assistants, a gap that may be dissuading publishers from flooding the web with even more of it. In other words, there has never been this much AI text, yet proportionally less of it is visible where discovery actually happens.
Why AI Content Floods the Web and Fuels Mass Publishing
The economics are irresistible. Generative tools can produce passable copy in minutes, letting operators spin up product roundups, how-to guides and news rewrites far faster than human writers can. That efficiency has touched off a rush of low-cost, high-volume sites chasing ad revenue and affiliate commissions.
Legitimate publishers, too, have run their own experiments with automation, some disclosed publicly and some not. Outlets that quietly published AI-written pieces under bylines or templated recaps, then had to walk back the resulting errors, illustrate both the temptation and the reputational peril. Media watchdogs like NewsGuard have catalogued hundreds of AI-generated news sites in multiple languages, many designed to look human-run.
How Search Engines Push Back on Scaled Low-Quality Content
Search engines have also cracked down on scaled, low-quality content, regardless of whether it was written by a human or an AI. Google has emphasized its “helpful content” principles and broadened its spam policies to cover mass-produced pages, while saying that AI use isn’t strictly against the rules but thin or misleading material is.
Graphite’s analysis sits on the same side of that debate: if AI-first articles underperform, the incentive shifts toward hybrid workflows or human-led pieces built on expertise and original value. In practice, sites that rely entirely on AI to churn out lists and rewrites may publish more but earn less traffic and trust.
Quality and Trust Are Where the Fault Lines Lie
Public sentiment remains cautious. Few Americans regularly get news from AI and the vast majority never do, according to Pew Research Center data, and those who do report low levels of trust. That skepticism sets a high bar for any publisher relying on automation to produce journalistic work in sensitive areas such as health, finance or civic information.
Detection remains a moving target. As model output grows more fluent, classifiers become less reliable, especially on short or heavily edited passages. That puts the pressure on downstream signals such as original reporting, citations, author identity and reader engagement to distinguish substantive work from automated filler.
What Smart Publishers Will Do Next to Compete With AI
Graphite’s data is not a call for “no AI” so much as “no shortcuts.” Editors who treat AI as a drafting or research tool, paired with human judgment, fact-checking and unique angles, will likely outperform pure automation. The winning formula is built on depth: proprietary data, on-the-record sourcing, expert analysis and clear accountability.
Operationally, that means implementing human-in-the-loop review, disclosing where AI is used and doubling down on quality signals such as author pages, sourcing transparency and structured data. Brands should also test rigorously, checking whether AI-assisted pages actually earn rankings, links and conversions rather than relying on volume as a proxy for reach.
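As a rough illustration of that kind of test, here is a minimal sketch that compares AI-assisted and human-led pages on a few downstream signals. The file name and columns (production_type, organic_clicks, referring_domains, conversions) are hypothetical stand-ins for whatever a publisher’s own analytics export contains, not a reference to any particular tool.

```python
# Minimal sketch: compare AI-assisted vs. human-led pages on downstream signals.
# Assumes a hypothetical export "pages.csv" with columns:
#   url, production_type ("ai_assisted" or "human_led"),
#   organic_clicks, referring_domains, conversions
import csv
from collections import defaultdict
from statistics import mean, median

METRICS = ["organic_clicks", "referring_domains", "conversions"]

def summarize(path="pages.csv"):
    # Group metric values by production type.
    groups = defaultdict(lambda: defaultdict(list))
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            kind = row["production_type"]
            for metric in METRICS:
                groups[kind][metric].append(float(row[metric]))

    # Report per-group averages so volume alone doesn't drive decisions.
    for kind, metrics in groups.items():
        print(f"\n{kind} ({len(metrics[METRICS[0]])} pages)")
        for metric in METRICS:
            values = metrics[metric]
            print(f"  {metric}: mean={mean(values):.1f} median={median(values):.1f}")

if __name__ == "__main__":
    summarize()
```

The same comparison could be sliced further by topic or publish date; the point is to judge AI-assisted pages by outcomes rather than by output volume.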
The web has crossed a threshold where machines write most of what’s new. If Graphite’s finding holds, visibility and trust, not raw slabs of text, will decide the winners. In a land of endless text, what remains scarce is originality and expertise.