Small Publishers Lost 60% of Search Traffic. Large Publishers Lost 22%.

Chartbeat data via Axios, March 17, 2026: small publishers' search decline was nearly triple that of large publishers.

Chartbeat measured search traffic declines across thousands of publisher sites from May 2023 through early 2026. Publishers under 10,000 daily page views lost 60% of their search referrals. Publishers over 100,000 daily lost 22%. AI chatbot referrals grew more than 200% in the same window and still account for under 1% of overall publisher page views. Small publishers' decline is nearly triple that of large ones, and chatbot referrals are nowhere close to filling the gap.

The 22 to 60 spread tells you which publishers had a moat

Axios broke the Chartbeat data on March 17, 2026, and the first thing worth sitting with is the shape of the spread. A 22% decline is painful. A 60% decline is structural. SEO skill does not explain a spread that size. What explains it is which publishers had non-search discovery channels in place when AI Overviews showed up.

Conde Nast CEO Roger Lynch told Axios that search fell from a majority of the company's visits to about 25%. Total revenue still grew, through subscriptions, events, and licensing deals. That is what a moat looks like when the water drops. The small publishers Chartbeat is tracking (the 1,000 to 10,000 daily page view cohort) did not have that infrastructure, which means every point of search decline flowed straight through to the traffic graph without getting absorbed elsewhere.

Reuters Institute's 2026 survey of 280 media executives put a number on where this is heading. Publishers expect another 43% decline in search referrals over the next three years, with one in five expecting losses above 75%. That is the baseline for planning, not the doom scenario. Most small publishers I've watched are still using recovery assumptions that the Reuters respondents have already abandoned.

The long tail was the small-pub strategy, and AI eats it first

The Ahrefs study that ran across 300,000 keywords found AI Overviews cut the position-one organic CTR by 34.5% on informational queries. The updated December 2025 data pushed that number to 58% for some query types. The loss is not uniform, though. It concentrates on the "how do I," "what is," "why does" long tail. The type of query that used to send a small publisher 50 to 200 visits a day across thousands of pages.

From what I've seen in traffic reports since last summer, that is exactly the query surface small publishers were built on. Long-tail informational content was the honest SEO play: write something genuinely useful for a low-competition query, build topical depth across a niche, get ranked, compound the traffic over a few years. The pitch made sense for more than a decade. AI Overviews killed the economics in about eighteen months, because an AI summary delivers the same answer without the click.
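To make that aggregation concrete, here is a back-of-envelope model. Every input is illustrative (the page count, per-page visits, and informational share are invented for the sketch); only the 58% CTR cut echoes the Ahrefs figure cited above.

```python
# Back-of-envelope model of how a CTR cut on long-tail informational
# queries aggregates across a small publisher's archive. All inputs
# except CTR_CUT are invented for illustration.

PAGES = 2000                 # long-tail informational pages (assumed)
VISITS_PER_PAGE = 4.0        # avg daily search visits per page (assumed)
INFORMATIONAL_SHARE = 0.85   # share of visits exposed to Overviews (assumed)
CTR_CUT = 0.58               # upper-bound Ahrefs CTR decline cited above

baseline = PAGES * VISITS_PER_PAGE
lost = baseline * INFORMATIONAL_SHARE * CTR_CUT
print(f"baseline search visits/day: {baseline:,.0f}")
print(f"visits lost/day: {lost:,.0f} ({lost / baseline:.0%} of search traffic)")
```

No single page loses much in absolute terms; the damage only shows up when you multiply a modest per-page loss across thousands of pages, which is why the decline looks sudden on an aggregate traffic graph.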

The short version: small publishers did not lose because they got worse at SEO. They lost because the type of query they ranked for is the type AI Overviews intercept most aggressively. Large publishers were never as exposed to that long tail in the first place, which is half the reason their decline stopped at 22%. They also had branded search, which AI Overviews still barely touch.

Blocking AI crawlers made the bleeding worse

This part surprised me. An analysis cited in the PPC Land coverage found that publishers who blocked AI crawlers lost 23% of total traffic and 14% of human traffic. Sites that allowed AI access actually held human traffic better.

The mechanism seems to be this: AI Overviews still pull from your rankings regardless of crawler access, but they drop the attribution link when you block. Readers who might have clicked through from an Overview citation to your original reporting never see the link, because there is no citation to click. Blocking does not remove you from the summary. It removes you from the credit.

There is also a second-order effect worth flagging. Some blocking configurations accidentally cut off Google's own indexing signals, because the same robots rules apply across crawler types. That is an implementation problem, not a strategy problem, but it turns up in audits more than it should. If your traffic loss is already bad and you are considering blocking AI access as a protest or a hedge, the data says it costs you roughly another 20% on top of what you have already lost. The Bauer shutdown in Germany is what the terminal version of that curve looks like.
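For what that implementation problem can look like in practice: the crawler tokens below (GPTBot, ClaudeBot) are real, but the catch-all rule is the hypothetical mistake. robots.txt wildcards do not distinguish AI crawlers from search crawlers, so the `*` rule applies to Googlebot too.

```text
# Intended: block AI training crawlers only.
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

# The overreach: a catch-all fallback like this also matches Googlebot,
# cutting off normal search indexing along with AI access.
User-agent: *
Disallow: /
```

The safe version stops after the named AI crawler blocks and either omits the `User-agent: *` group entirely or gives it an empty `Disallow:` line.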

The playbook that scaled to traffic does not scale to readers

The numbers stop telling you what to do at this point, and the strategy question takes over. Conde Nast can pivot to subscriptions because it has brand equity built through Vogue and The New Yorker. A gardening blog with 5,000 daily visitors cannot. That is the uncomfortable asymmetry buried inside the 43% decline number that publishers are forecasting.

The industry conversation has moved toward "diversify referral sources" as if that is a strategy. For large publishers with existing direct audiences, it probably is. For small publishers, "diversify" usually means build an email list from scratch, ship a newsletter, try to get TikTok views, maybe launch an app. Each of those is its own full business with acquisition costs that get harder every month. Telling a small publisher to diversify is roughly equivalent to telling them to start over.

What is actually left as a playbook for anyone under 100K daily page views: assume search is a secondary channel within two years, build direct relationships with the readers you have now, and treat every rankings-driven visitor as a one-time audition rather than a funnel. Email opens and app opens are how people come back. Search does not owe you a return visit anymore. On paper, that sounds harsh. In practice, it is probably closer to the 2019 newsletter strategy than anyone in SEO wants to admit.

The audit worth running this week

The honest question is not "how do I recover search traffic." It is "which pages on my site were only ever going to work as long as Google sent long-tail traffic." That content is not coming back. No consolidation trick fixes a demand shift, which is also what ClickUp discovered across its 2,815 blog posts.

The rebuild starts with the subset of pages that still generate email signups, referrals, or returning visits. The stuff people remember reading. That is a much smaller surface than a small publisher's full archive, but it is the surface that survived an AI Overviews rollout, a Helpful Content Update, and a Discover shakeup inside the same twelve months. Pages that already earn retention are the ones worth deepening. Everything else is a sunk cost and has to be treated like one.
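If you want to run that audit mechanically, a minimal sketch: assume you can export per-page sessions split by channel plus signup and return-visit counts. The column names and thresholds here are hypothetical, not taken from any specific analytics tool, and should be tuned against your own data.

```python
import csv
import io

# Hypothetical analytics export. Column names are illustrative,
# not from any real analytics product.
SAMPLE = """page,search_sessions,other_sessions,email_signups,returning_visits
/how-to-prune-roses,1800,40,0,12
/soil-ph-guide,2200,15,1,8
/newsletter-archive/june,120,900,45,310
"""

def audit(rows, search_share_cutoff=0.9, retention_floor=20):
    """Split pages into 'search-only' (at risk) and 'retained' buckets.

    A page is search-only when nearly all of its sessions came from
    search AND it shows almost no retention signal (signups + returns).
    Both thresholds are assumptions to tune against your own numbers.
    """
    at_risk, retained = [], []
    for r in rows:
        search = int(r["search_sessions"])
        other = int(r["other_sessions"])
        total = search + other
        share = search / total if total else 0.0
        retention = int(r["email_signups"]) + int(r["returning_visits"])
        if share >= search_share_cutoff and retention < retention_floor:
            at_risk.append(r["page"])
        else:
            retained.append(r["page"])
    return at_risk, retained

rows = list(csv.DictReader(io.StringIO(SAMPLE)))
at_risk, retained = audit(rows)
print("at risk:", at_risk)
print("worth deepening:", retained)
```

The point of the split is the second bucket, not the first: the "retained" list is the surface worth deepening, and it is usually much smaller than the archive.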

Most small publishers I've watched in 2025 made two mistakes in sequence. They kept publishing the same long-tail informational content through most of the year. Then, when traffic dropped, they tried to rank for higher-intent queries against larger competitors and lost there too. The 60% number is what it looks like when both of those happen at once.

I don't think the answer is to write less. It is probably to write with a different proof point: will this piece still have readers if Google stops referring anyone to it? If the answer is yes, it is worth writing. If not, the hours go further into the newsletter or the channel that still pays.

Notice Me Senpai Editorial