Liz Reid Repeated Google's 'Bounce Clicks' Defense Without Showing the Data

Reid's argument hinges on a click pattern Google has yet to chart in public.

Liz Reid, Google's head of Search, has used the phrase "bounce clicks" three times since August 2025 to explain AI Overview traffic losses, most recently on Bloomberg's Odd Lots on April 23, 2026. She shared no supporting data in any appearance. Independent studies measured a 65% organic CTR drop, an 8% click rate with AI Overviews versus 15% without, and a 34% global publisher decline.

What "bounce clicks" actually claims

Reid's argument, as laid out in her August 2025 Google blog post and repeated in the Bloomberg interview, runs like this. AI Overviews don't reduce useful traffic to publishers. They reduce a category of click Reid calls "bounce clicks": a user types a question, clicks the top result, sees the answer in two seconds, and slams the back button. The page didn't help. The publisher didn't earn anything from the visit anyway. AI Overviews surface the answer directly, the user is satisfied, and the bounce nobody wanted gets removed.

Total organic click volume, in Reid's words from that blog post, is "relatively stable year-over-year" and "average click quality has increased." On Bloomberg she added, "If all you were going to do was go to the webpage, see the fact, and immediately click back, you're going to spend like a half a second on the page."

It is a clean argument. It is also effectively unfalsifiable from the outside until somebody at Google publishes a chart, and that has not happened.

The data Google didn't show

I want to be fair to Reid for a second. The argument is plausible in theory. People do click results and bounce back. AI Overviews can plausibly siphon some of those off. The question is whether the siphon is small enough that the net effect on quality clicks is neutral, or whether it's large enough to take meaningful traffic with it on the way out.

Search Engine Journal pointed out the obvious gap: across the August blog, the October Wall Street Journal interview, and the April 23, 2026 Bloomberg appearance, Reid hasn't shared a single year-over-year comparison, percentage, or chart. Eight months of repeating the same framing with no underlying numbers attached.

If the data made the case, you would expect the chart by now. Marketing teams know this from the inside. When a deck is missing the slide, the slide isn't missing.

What three independent studies measured instead

While Google has been talking, three groups have been counting.

Seer Interactive's September 2025 study tracked 3,119 informational queries across 42 organizations, covering 25.1 million organic impressions and 1.1 million paid impressions between June 2024 and September 2025. For queries with an AI Overview, organic CTR fell from 1.76% to 0.61%. That's a 65% drop. Paid CTR on the same queries fell 68%, from 19.7% to 6.34%. The bright spot, and it's a small one, was that brands actually cited inside the Overview earned 35% more organic clicks than they did before. The ranking-but-not-cited middle is where the bottom fell out.

Pew Research ran a different experiment, watching how 900 U.S. adults actually behaved across 68,879 Google search queries. When an AI Overview appeared, users clicked a traditional result on 8% of visits. When no Overview appeared, they clicked 15%. Click-throughs to links inside the Overview itself were 1%. Pew also found users were more likely to end the browsing session entirely on AIO pages: 26% session-end versus 16% without.

Then there's Chartbeat's December-to-December 2025 data via Press Gazette. Google Search referrals to publishers fell 34% globally and 38% in the US between December 2024 and December 2025. Discover referrals fell 15%. Small publishers (1,000 to 10,000 daily page views) absorbed the worst of it, with the same Chartbeat data set showing 60% declines over two years versus 22% for large publishers.

Three different methodologies, three different sample bases, all pointing the same direction. None of them prove Reid wrong by themselves. Together they make the position that nothing meaningful is happening hard to hold.

How GSC reads if you take Reid at her word

Here's the part that actually changes how you work.

If you accept Reid's framing, the implication is that you can't use Google Search Console to evaluate AIO impact on its own. GSC reports clicks. It doesn't report dwell time, scroll depth, or what happens after the click. So if total clicks are down on a page that ranked first for an AIO-eligible query, Reid would say you're seeing the bounce-click drain, not a real loss. That's not crazy as a hypothesis. It's also not testable inside GSC.

The burden of proof, in other words, just got handed to you. Google's claim is that the clicks you lost were the ones you didn't want. Until Google publishes data, the only way to evaluate that on your own site is to layer GSC against an analytics tool that captures engagement, then look query by query.

From what I've seen on accounts that have run this kind of audit, the picture is rarely flat. Engagement metrics on the surviving clicks usually look fine, sometimes a touch better. But total engaged sessions almost always drop too. Which suggests the bounce clicks are leaving and the quality clicks are leaving too, just at different rates. Reid's framing seems to be partially right and substantively misleading at the same time.

A 30-minute audit before you assume the data is lying

Pick the 50 queries in GSC that lost the most clicks year over year. For each, pull the corresponding landing page's GA4 (or Plausible, or whatever you run) data over the same window. Look at three numbers.

  1. Sessions from organic search.
  2. Engaged sessions (in GA4, a session that lasts longer than 10 seconds, logs a key event, or has at least two pageviews; use your tool's nearest equivalent).
  3. Conversions (form fill, purchase, scroll-to-end, whatever you actually count).
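Once you have both exports side by side, the three-number pull is mechanical. Here's a minimal sketch in plain Python; the field names (`clicks_now`, `sessions_prior`, and so on) are placeholders for whatever your GSC and analytics exports actually call these columns, not real export headers:

```python
def yoy_delta(current, prior):
    """Percent change from prior period to current; None if prior is zero."""
    if prior == 0:
        return None
    return (current - prior) / prior * 100

def audit_rows(gsc_rows, analytics):
    """Join GSC query rows to landing-page engagement and compute YoY deltas.

    gsc_rows:  [{query, page, clicks_now, clicks_prior}, ...]
    analytics: {page: {sessions_now, sessions_prior,
                       engaged_now, engaged_prior,
                       conv_now, conv_prior}}
    """
    out = []
    for row in gsc_rows:
        a = analytics.get(row["page"])
        if a is None:
            continue  # no engagement data for this landing page
        out.append({
            "query": row["query"],
            "clicks_delta": yoy_delta(row["clicks_now"], row["clicks_prior"]),
            "sessions_delta": yoy_delta(a["sessions_now"], a["sessions_prior"]),
            "engaged_delta": yoy_delta(a["engaged_now"], a["engaged_prior"]),
            "conv_delta": yoy_delta(a["conv_now"], a["conv_prior"]),
        })
    # biggest click losers first; rows with no prior clicks sort last
    out.sort(key=lambda r: (r["clicks_delta"] is None, r["clicks_delta"]))
    return out
```

The join key here is the landing page, which is the weak point of the whole exercise: GSC reports per query, analytics reports per page, and one page can rank for many queries. For a 50-query audit that imprecision is tolerable; just don't treat the output as query-level ground truth.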

If sessions are down but engaged sessions and conversions are roughly flat, Reid's bounce-click story is approximately accurate for your site. AI Overviews are filtering noise. Live with it.

If all three are dropping in similar proportions, you're losing the clicks that earn you money, and Reid's framing is doing PR work, not analytical work. Plan accordingly: chase AIO citation explicitly, accept lower volume from this channel, or shift effort to the queries where AIO doesn't trigger.

That sounds obvious. The reason it's worth saying is that I keep seeing teams take Reid's word for it and stop measuring. The whole rhetorical move of "bounce clicks" is to give people a reason not to look at the data they already have. Look anyway. The 30-minute version isn't hard. The story you find in your own analytics is the only one that matters when you walk into next quarter's planning meeting.

We covered the AI Overviews CTR rebound that turned out to be a Google impressions bug earlier this month, and the lesson there is the same. When Google's framing of the data and the data itself disagree, the data wins.

One more "bounce clicks" press cycle without numbers behind it, and the framing is going to start sounding more like a brand campaign than a research finding. The publishers who got cut don't need a better story. They need a chart.

Notice Me Senpai Editorial