Wikipedia, the free online encyclopedia built by volunteers, remains a rich source of information, but its audience is shrinking, especially in some languages. Human pageviews have fallen roughly 8% year-over-year, according to a senior product leader at the Wikimedia Foundation, which notes the figures are estimates. The decline is driven in part by AI-generated search summaries and a growing habit of turning to social video platforms for answers rather than going directly to the open web.
The decline became clearer after an update to Wikipedia’s bot-detection systems revealed that a chunk of the spring traffic surge was coming not from people but from evasive automated activity. Once that noise was removed, the underlying trend showed far fewer humans clicking through. The explanation, Wikimedia contends, is simple: search engines are answering more and more questions right on the results page (“zero-click searches,” in industry argot), and younger users are taking their queries straight to TikTok, YouTube Shorts and other video feeds instead.
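For the curious, Wikimedia exposes this breakdown publicly: its Pageviews API reports aggregate counts by agent type, with "user" covering traffic classified as human and "spider" and "automated" covering the rest. The sketch below is a minimal illustration in Python of the kind of comparison involved; the project and months are arbitrary examples, and the result is whatever the live API returns, not the foundation's internal estimate.

```python
import requests

# Wikimedia's public Pageviews API aggregates monthly counts by agent type;
# the "user" agent filters out traffic classified as spiders or automation.
API = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/aggregate/"
       "{project}/all-access/user/monthly/{start}/{end}")

def monthly_human_views(project: str, start: str, end: str) -> int:
    """Sum human-classified pageviews for a project over a date range.

    start/end use the API's YYYYMMDDHH timestamp format.
    """
    url = API.format(project=project, start=start, end=end)
    # The API asks clients to send a descriptive User-Agent.
    resp = requests.get(url, headers={"User-Agent": "pageview-trend-demo"})
    resp.raise_for_status()
    return sum(item["views"] for item in resp.json()["items"])

# Compare the same month a year apart (months chosen here are arbitrary).
this_year = monthly_human_views("en.wikipedia", "2025080100", "2025083100")
last_year = monthly_human_views("en.wikipedia", "2024080100", "2024083100")
change = (this_year - last_year) / last_year * 100
print(f"Year-over-year change in human pageviews: {change:+.1f}%")
```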
AI answers and the zero-click squeeze on Wikipedia
Generative AI is search’s new front door. Rather than leading with blue links, mainstream search engines increasingly place a direct answer at the top of the page, often drawing on sources such as Wikipedia. Users get a summary faster, and they can leave without ever clicking through. Independent analyses from Similarweb and SparkToro have noted for years that more than half of web searches already end without a click, and AI summaries deepen that zero-click pattern.
Search providers counter that AI overviews can still drive visits and that prominent citations encourage more substantive reading. They also say they send significant traffic to reference sites and that high-quality answers build user trust. But even if outbound links remain, the weight of attention is moving upstream to the results page. When the synthesis satisfies most casual curiosity, fewer people arrive to read full articles, check references or contribute corrections.
There is also a provenance problem. The answers AI systems produce are typically synthesized from many sources, which obscures when an underlying claim rests on volunteer-written Wikipedia content. That opacity erodes the feedback loop that has traditionally sustained the encyclopedia’s accuracy and breadth.
Social video pulls the next generation of searchers
The other headwind is social video. For younger users especially, discovery now begins in a vertical feed, not a browser’s address bar. Pew Research has found a sharp rise in the share of U.S. adults ages 18–29 who regularly get news on TikTok, and the Reuters Institute has documented a broader shift toward creators and influencers as information intermediaries.
In practice, that means a question that once prompted a click to Wikipedia (what is a superconductor, who was this leader, how does a runoff election work) now gets resolved with a 30-second explainer. Some creators cite and link to their sources, but many do not. Even when Wikipedia supplies the facts behind the scenes, viewers never reach the article itself to verify a claim or track down a citation.
The stakes for the digital commons and open knowledge
Traffic is not a vanity metric for an open encyclopedia. The drop has consequences on the margin: fewer visits can mean fewer new editors discovering how to contribute, slower improvement of articles on underrepresented topics and in smaller languages, and a thinner pipeline of the small donors who sustain the infrastructure. The Wikimedia model relies on a virtuous cycle: readers become contributors, contributors improve content, and better content attracts more readers.
There is also a quality risk when the public consumes more information detached from its sources. AI systems can present incorrect conclusions with confidence, and without a clear path back to citations, errors become harder for readers to catch and fix. The encyclopedia’s emphasis on verifiability and cited sources helps arrest that drift, but only if platforms and readers uphold it.
What Wikipedia will do next to sustain its ecosystem
Wikimedia leaders say they are open to new ways people learn, even as they urge the AI, search and social platforms that depend on Wikipedia to send more users back to the source. The organization is developing a framework for attribution that would standardize how reusers give credit and link back, one prong alongside projects such as Wikimedia Enterprise, its API service that supplies higher-quality content to large partners with clearer licensing and provenance.
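As a sense of what “credit and link back” can look like today, the sketch below uses Wikipedia’s existing public REST summary endpoint; it is not the attribution framework under development, whose design has not been published. The endpoint and JSON fields ("extract", "title", "content_urls") come from the live API, while the credit-string format is an assumption for illustration.

```python
import requests

# Wikipedia's public REST API returns a page extract plus the metadata a
# reuser needs to attribute properly: the title and a canonical page URL.
# Wikipedia's text is licensed CC BY-SA, which requires credit and a link.

def fetch_with_attribution(title: str) -> dict:
    url = f"https://en.wikipedia.org/api/rest_v1/page/summary/{title}"
    resp = requests.get(url, headers={"User-Agent": "attribution-demo"})
    resp.raise_for_status()
    data = resp.json()
    return {
        "extract": data["extract"],
        "source_url": data["content_urls"]["desktop"]["page"],
        # Hypothetical credit format, not a Wikimedia-specified string.
        "credit": f'Source: Wikipedia, "{data["title"]}" (CC BY-SA)',
    }

summary = fetch_with_attribution("Superconductivity")
print(summary["extract"][:200], "...")
print(summary["credit"], "-", summary["source_url"])
```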
Internally, growth and reader-experience teams are building features that make it easier for newcomers to start editing without diving into esoteric wiki markup, surface locally relevant content in readers’ own language editions, and streamline references on mobile. The foundation previously experimented with machine-generated article summaries but paused the rollout after community feedback, a cautious approach that prioritizes the role of volunteers and accuracy.
Ordinary users are not exempt from this call to action. When a search result or a quick video answers your question, check the citations and click through to see where the material comes from. If the source is Wikipedia, consider editing or donating. The health of the information commons depends not only on AI’s capacity to summarize; it also rests on the human systems that produce, vet and supply the knowledge being summarized.