AI Overviews Took 265 Million Monthly Clicks in One Country. The Category Breakdown Tells a Different Story.

SISTRIX March 2026 data reveals AI Overviews cut Position 1 CTR by 59%, but the impact varies wildly by content category.

SISTRIX dropped their March 2026 AI search analysis last week, and the headline stat hit hard: 265 million organic clicks per month are disappearing in the German market alone because of AI Overviews. Position 1 CTR collapsed from 27% to 11% when an AI Overview appears. That is a 59% reduction.

Those numbers are going to be in every SEO newsletter this week. And on their own, they will lead most teams to the wrong conclusion.

The aggregate picture looks catastrophic. Overall organic click rates drop from 57% to 33% on queries where AI Overviews show up. Search Engine Journal reported the same findings with additional context: 79% of German AI Overviews appear above the organic results, which means most users never even scroll to Position 1 anymore. Gemini's market share hit 22% as of January 2026, up from 5.3% twelve months earlier. The trajectory is obvious.

But the part that matters more, and that most coverage is burying, is how unevenly this hits.

Recipe Sites Lost 1%. Health Content Lost 29%. That Spread Is the Actual Story.

SISTRIX's category breakdown is where the useful information lives. Chefkoch, Germany's largest recipe site, saw a roughly 1% traffic decline from AI Overviews. Booking.com saw 0.46%. Meanwhile, health portals like NetDoktor lost 29-30%, and parenting content sites saw 24% declines.

Wikipedia got hit hardest in absolute terms at 31.6 million lost clicks per month in Germany, but as a percentage of its total Google traffic, that is only about 5%.

The pattern becomes obvious once you see it laid out like this: AI Overviews eat informational queries where the answer is factual and self-contained. "Symptoms of strep throat" gets a complete answer in the Overview. "How to make beef bourguignon" gets a recipe card that still makes you click through because cooking is a sequential process you can't absorb from a summary paragraph.

There is a real analogy here to the old SEO distinction between navigational and informational queries, except now the line runs between "summarizable" and "requires engagement." One side of that line is getting hammered. The other barely notices.

Transactional queries, structured data-heavy content, and anything requiring extended engagement are largely untouched. This is not a uniform tax on publishers. It is a category-specific extraction of content that Google can satisfactorily summarize in three sentences.

The Modeled Number vs. What Your Dashboard Actually Says

This is where I think a lot of the coverage gets sloppy. The 265 million figure is modeled, not measured. SISTRIX extrapolated from CTR curves applied across 100 million-plus keywords to estimate the aggregate click volume shift. That is methodologically sound for understanding macro trends, but it does not tell any individual site what happened to their traffic.

A Digiday analysis from earlier this year pegged the overall publisher referral traffic decline at 25%. Separately, Press Gazette reported global Google traffic to publishers was down 33% year-over-year, with the US specifically down 38%, citing Chartbeat data.

But some of those "lost" clicks were zero-click searches that never converted anyway. A user who searched "what year was the Eiffel Tower built" was never going to click through to a publisher and subscribe or buy anything. That click was already worthless from a business standpoint, and counting it as a "loss" inflates the alarm. On paper, losing a million clicks sounds devastating. In practice, it depends entirely on what those clicks were worth.

The more useful question for any individual site: what percentage of your keywords now trigger AI Overviews, and what is the conversion value of the traffic those queries used to send? If 40% of your keyword portfolio triggers AI Overviews but those queries had a 0.1% conversion rate, the actual revenue impact is probably noise. If 15% of your portfolio triggers Overviews but those were your highest-converting informational queries driving email signups, that is a real problem you need to address this month.
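That portfolio math is easy to run. Here is a minimal back-of-envelope sketch of it in Python; the field names, the sample numbers, and the assumption that the SISTRIX-observed 59% CTR reduction applies uniformly are all illustrative simplifications, not anything from the report:

```python
def aio_revenue_impact(keywords):
    """Sum the estimated monthly revenue at risk across AIO-exposed keywords.

    Each keyword dict carries: monthly_clicks, conversion_rate,
    value_per_conversion, and triggers_aio (bool). We assume the
    ~59% CTR reduction applies uniformly, which is a simplification --
    real impact varies by position and category.
    """
    CTR_REDUCTION = 0.59  # Position 1 CTR fell from 27% to 11% per SISTRIX
    at_risk = 0.0
    for kw in keywords:
        if not kw["triggers_aio"]:
            continue  # unaffected keywords contribute nothing
        lost_clicks = kw["monthly_clicks"] * CTR_REDUCTION
        at_risk += lost_clicks * kw["conversion_rate"] * kw["value_per_conversion"]
    return at_risk


# Hypothetical two-keyword portfolio illustrating the article's point:
portfolio = [
    {"monthly_clicks": 50_000, "conversion_rate": 0.001,
     "value_per_conversion": 2.0, "triggers_aio": True},   # high-volume, near-worthless
    {"monthly_clicks": 8_000, "conversion_rate": 0.04,
     "value_per_conversion": 25.0, "triggers_aio": True},  # low-volume, high-converting
]
print(round(aio_revenue_impact(portfolio), 2))
```

Note how the small high-converting keyword dominates the estimate: the 50,000-click informational query contributes almost nothing, which is exactly why exposure rate alone is the wrong metric.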

The 20% Number That Quietly Changes the Math

Only about 20% of German search keywords currently trigger an AI Overview. That means 80% of queries remain completely unaffected.

For SEO practitioners who are not in a decimated category, the aggregate statistics make the situation sound worse than what their own dashboards show. I would be genuinely curious to see how many teams panicking about the 265 million number have actually checked their own keyword exposure rate. My guess: not many.

But that 20% is growing. Twelve months ago, it was considerably lower, and Google's investment in AI infrastructure suggests they are pushing it higher. The company's TPU cost advantage (reportedly up to 5x lower production costs than competitors for AI responses) removes the biggest constraint on expanding AI Overviews to more query types.

AdExchanger framed this well: the structural threat is not today's 20%. It is what happens when that number hits 50% or 60%, and it reaches your category. Some local publishers are already adapting through collaboration and first-party data strategies. Others are still reading aggregate stats and assuming the problem is elsewhere.

What a Practical Keyword Audit Looks Like Right Now

I think the useful response to this data is not panic but portfolio-level diagnosis. And honestly, I am surprised more coverage is not suggesting this rather than just reporting the scary aggregate number.

Pull your top 200 keywords by traffic value (not volume, value). Run them through SISTRIX or any AI Overview detection tool. Flag which ones currently trigger an AI Overview. Then segment them into three buckets:

Exposed keywords: These already trigger an AI Overview, and your position or CTR has declined in the last 6 months. They need content restructuring toward formats AI cannot easily summarize: interactive tools, primary research, step-by-step processes with images, opinion-driven analysis. The goal is making your content impossible to condense into three sentences.

At-risk keywords: Informational queries with no AI Overview yet, but the content format is a straight factual answer that Google could easily summarize tomorrow. These represent your next 6 months of exposure. Start building alternative traffic sources now, whether that means diversifying beyond Google or investing in direct audience channels.

Safe keywords: Transactional, navigational, or complex informational queries where click-through is inherent to the user's need. Keep doing what you are doing.
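The three buckets above reduce to a simple decision rule once you have the flags. This sketch assumes you have already annotated each keyword with whether it triggers an Overview, its intent, and a judgment call on whether a three-sentence answer satisfies it; the field names are illustrative:

```python
def bucket_keyword(kw):
    """Classify one keyword dict into 'exposed', 'at_risk', or 'safe'.

    kw carries: triggers_aio (bool), intent ('informational',
    'transactional', or 'navigational'), and summarizable (bool --
    your judgment on whether a short factual answer satisfies the query).
    """
    if kw["triggers_aio"]:
        return "exposed"    # restructure toward non-summarizable formats
    if kw["intent"] == "informational" and kw["summarizable"]:
        return "at_risk"    # likely next in line for an AI Overview
    return "safe"           # click-through is inherent to the query


# Illustrative mini-audit using examples from the article:
audit = [
    {"query": "symptoms of strep throat", "triggers_aio": True,
     "intent": "informational", "summarizable": True},
    {"query": "how to make beef bourguignon", "triggers_aio": False,
     "intent": "informational", "summarizable": False},
    {"query": "buy cast iron dutch oven", "triggers_aio": False,
     "intent": "transactional", "summarizable": False},
]
for kw in audit:
    print(kw["query"], "->", bucket_keyword(kw))
```

The hard part is not the code, it is the `summarizable` judgment call per query; the category breakdown in the SISTRIX data is a reasonable prior for making it.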

The audit takes a morning. The category breakdown in the SISTRIX data gives you a reasonable prior for which queries are most vulnerable, but your own Search Console data is the final word on whether the aggregate applies to you.

This Probably Gets Worse Before Anyone Admits It Has Stabilized

SISTRIX founder Johannes Beus said something worth sitting with: "The traffic that now flows to Google will never come back." That is a strong claim, and my read is a bit softer. Or maybe more specific. I think the traffic that flowed to sites providing factual summaries will never come back. That is gone. The traffic that flowed to sites providing analysis, tools, unique data, and genuine expertise was never at serious risk from AI Overviews, because a three-sentence summary cannot replace those experiences.

The uncomfortable part is that a lot of publisher traffic was in that first bucket. And most SEO strategies over the past decade were optimized for exactly the type of content AI can now generate and display without a click. That model is done in categories where AI Overviews have landed, and it will be done in more categories as Google expands coverage.

I keep coming back to the recipe site data. Chefkoch lost 1%. That is not survival through luck. It is survival through format. Nobody reads a recipe in a search result snippet and then cooks dinner from memory. The content requires engagement by its nature, and that engagement requires a click.

The sites that will look like Chefkoch's numbers a year from now are the ones whose content similarly cannot be consumed as a three-sentence summary. Everyone else should probably stop waiting for the aggregate numbers to get better.