GA4 Named Its AI Assistants Channel Around the 70.6% It Can't Actually See
Google Analytics rolled out a new AI Assistants channel on May 13, 2026, automatically tagging traffic from recognized chatbots like ChatGPT, Gemini, and Claude with the medium ai-assistant and grouping it in default reports. The channel only captures sessions that arrive with a recognized referrer header. ChatGPT's Atlas browser and similar in-app webviews strip those headers, so most ChatGPT clicks still land in Direct.
What Google actually shipped in the May 13 update
GA4 now has a fourth dimension value that automation can sort against. Sessions from recognized AI assistants land with medium=ai-assistant, default channel group=AI Assistant, and campaign=(ai-assistant). The change was announced in Google's "What's new in Analytics" entry on May 13 and applies to Default Channel Group reports without configuration. There's no toggle, no property setting, no migration step. The bucket just appears the next time the report refreshes.
Google did not publish the recognized-referrer list. The help-center entry names ChatGPT, Gemini, and Claude as examples but doesn't say which other tools (Perplexity, Copilot, You.com, Phind, claude.ai web sessions versus Claude API surfaces) are inside or outside. MarTech had been arguing for this exact bucket since last summer, roughly the point most analytics community discussions identify as when GA4's Default Channel Group started looking obviously broken for AI traffic. From what I've seen across community threads since the rollout, Perplexity referrers seem to be bucketing correctly because Perplexity passes perplexity.ai as a referrer cleanly; ChatGPT and Claude, less so. The pattern looks consistent with how Google's "Organic Social" channel works: a private match list, updated silently, with no public roadmap.
The 70.6% problem GA4 didn't actually solve
The new channel only works when a referrer header arrives. That's the whole game. According to our earlier breakdown of last-click attribution against AI traffic, roughly 70.6% of AI-adjacent visits in one analysis of just under 450,000 sessions landed in GA4 as Direct because no referrer header came with them. The new channel cannot recover any of those. The bar that gets filled inside the AI Assistant row is whatever fraction of clicks happened to keep their headers intact on the way out.
MarTech confirmed the mechanism specifically for ChatGPT's Atlas browser and Perplexity's Comet: Atlas strips referrer headers on most outbound clicks, so sessions show up as Direct or (not set), while Comet usually passes perplexity.ai through cleanly. Both are now consumer-grade AI browsers shipping inside their respective apps, and Atlas in particular is what most ChatGPT power users click through on mobile.
So when Google says "you can now identify how users are discovering your site through chatbots like ChatGPT," what they actually mean is: you can now identify the subset of ChatGPT sessions that came through plain-web ChatGPT inside a third-party browser. The Atlas-native sessions, which seem to be growing fastest, stay invisible. That's the part most launch coverage skipped, and it's the part that matters when you're trying to size the channel for a media plan.
Why the recognized-referrer list won't be published
Google's pattern with channel detection is to maintain a private match list they update silently. The Default Channel Group rules for Organic Social, Affiliates, and Email have all worked this way for years. Nothing about the AI Assistant launch suggests they'll handle this one differently. The result: when a new AI tool ships and starts driving real clicks, your AI Assistant channel won't catch it until Google adds the referrer pattern on their side. There's no way for you to push a new entry in. You wait.
That probably matters more in the next twelve months than it would at any other moment. The list of consumer AI tools driving actual web clicks went from roughly three to about twelve in the last year. Some will get added quickly; some will sit in Referral or Direct for months. From what I've seen, the major ones (ChatGPT, Gemini, Claude, Perplexity, Copilot) will likely be on the list by Q3, and the long tail won't make it until Q1 2027 at the earliest.
The setup most analytics teams should run this week
If your team relies on GA4 channels to report AI-driven traffic, the default AI Assistant channel is necessary but nowhere near sufficient on its own. Build a custom channel group above Referral that catches what Google's list misses.
In Admin → Data settings → Channel groups → Custom channel groups, add a rule called "AI Assistant (Extended)" that matches source against a regex covering perplexity.ai, you.com, phind.com, copilot.microsoft.com, bing.com/chat, claude.ai, and any other AI tool you actually see in your referrer report. Place this rule above the Referral rule so it captures first. KP Playbook's writeup has a working regex and the full UI walkthrough.
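The matching logic behind that custom rule is simple enough to sanity-check outside GA4. Here's a minimal Python sketch of the "match source against a regex, fire before Referral" behavior; the hostname list is illustrative (drawn from the examples above, not from Google's private match list), and you'd extend it with whatever actually shows up in your own referrer report.

```python
import re

# Illustrative pattern only -- extend with the AI referrers you
# actually see in your referrer report. This is NOT Google's list.
AI_ASSISTANT_SOURCES = re.compile(
    r"(perplexity\.ai|you\.com|phind\.com|copilot\.microsoft\.com"
    r"|bing\.com/chat|claude\.ai)",
    re.IGNORECASE,
)

def classify(source: str) -> str:
    """Mimic the custom rule: 'AI Assistant (Extended)' sits above
    Referral in the channel group, so it gets first match."""
    if AI_ASSISTANT_SOURCES.search(source):
        return "AI Assistant (Extended)"
    return "Referral"

print(classify("perplexity.ai"))         # AI Assistant (Extended)
print(classify("news.ycombinator.com"))  # Referral
```

Rule order is the part people get wrong in the GA4 UI: if the rule sits below Referral, Referral matches first and the extended bucket never fills.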
That captures the visible long tail. To catch the invisible part, the Atlas-stripped, Direct-bucketed sessions, the only real solution is content-level analytics. Pages that pick up more Direct traffic immediately after appearing in an AI Overview or chatbot citation are doing AI-driven work, even if GA4 can't tag the session. Microsoft's Clarity Citations tab, now free, does this server-side without depending on referrers, and it'll show you the citation event upstream of the click.
The numbers worth labelling before leadership quotes them
A few stats are floating around the GA4 announcement coverage, and they need labels because they aren't interchangeable. PPC Land's coverage cited industry data showing 63.6% of marketers receive some AI tool referral traffic, ChatGPT accounts for about 0.21% of total website traffic versus Google's roughly 40%, and AI search visitors convert at rates around 23x organic search visitors.
That 23x figure is the headline most teams will quote in a deck, and it's the most easily misread. The 23x is a conversion rate, not conversion volume. ChatGPT sends roughly 1/190th the traffic Google does (0.21% versus roughly 40%), so a 23x higher rate on a 1/190 base still produces dramatically less total revenue. The AI Assistants channel will make this exact contrast more visible inside GA4, which is useful, but it'll also make it easy for leadership to over-index on the conversion-rate number without the volume context. If you ship a deck this week with the 23x stat, put the absolute session count next to it, on the same slide, in the same font size.
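The rate-versus-volume point is back-of-envelope arithmetic, and worth running once before it lands in a deck. The session counts and the 1% baseline conversion rate below are hypothetical placeholders; only the 23x multiplier and the ~1/190 traffic ratio come from the coverage.

```python
# Back-of-envelope: a 23x conversion rate on ~1/190th the traffic.
# Absolute numbers are hypothetical; the 23x and 1/190 ratios are
# the figures cited in the announcement coverage.
google_sessions = 190_000   # hypothetical monthly organic-search sessions
chatgpt_sessions = 1_000    # ~1/190th, per the 0.21% vs ~40% share figures

organic_cvr = 0.01          # hypothetical 1% baseline conversion rate
ai_cvr = organic_cvr * 23   # the "23x" headline rate

google_conversions = google_sessions * organic_cvr    # ~1,900
chatgpt_conversions = chatgpt_sessions * ai_cvr       # ~230

print(round(google_conversions), round(chatgpt_conversions))
```

Even at 23x the rate, the AI channel here produces roughly an eighth of the conversions, which is exactly the contrast the slide needs to show.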
The boring audit I'd start with before touching channel groups
Honestly, I'd start with the boring audit before opening Channel Group settings. Pull the last 30 days of Direct traffic by landing page and sort by sessions descending. Pages that are ranking in AI Overviews, getting cited in ChatGPT answers, or are popular knowledge-graph entities will float to the top. That list is roughly the inventory of pages that just got more visible inside the new AI Assistant channel, and the same pages are where the Atlas-stripped traffic is still hiding inside Direct.
Then layer the new channel on top. Compare AI Assistant sessions to the Direct-suspect set you already identified. The delta is the size of the referrer-stripping problem at your specific properties, and it's the number to bring to anyone asking "should we invest more in AEO." On paper, the new channel sounds like it answers that question on its own. In practice, it answers the question for whichever fraction of users happened to keep their headers attached.
The launch is small, but it's the first time Google has acknowledged in product that AI assistants are a distinct traffic source rather than an aggregator quirk. The reporting story still has gaps wide enough to drive an Atlas through. I'd rather start working with a partial answer this week than wait for Google to ship the rest.
Notice Me Senpai Editorial