Sam Altman Is Worried About the Dead Internet He Built
Sam Altman tweeted something remarkable last September. "I never took the dead internet theory that seriously," he wrote, "but it seems like there are really a lot of LLM-run twitter accounts now."
The CEO of OpenAI, the company whose chatbot kicked off the largest flood of synthetic content the internet has ever seen, looked around at the mess and expressed surprise. The internet reminded him, with considerable enthusiasm and a lot of Tim Robinson hot dog suit memes, that he is directly responsible for most of what he's complaining about.
Normally I'd file that under "tech CEO discovers consequences." But this one matters for anyone running content marketing right now. Because the dead internet isn't a conspiracy theory anymore. It's a measurement problem, and the measurements are getting worse.
The numbers stopped being theoretical about six months ago
Originality AI has been tracking AI-generated content in Google's top 20 search results since 2019. Back then, 2.27% of top results were synthetic. As of their latest measurement, that number sits at 17.31%. That's a more than sevenfold increase, and most of it happened after ChatGPT launched.
But search results are only part of the picture. Gartner reported that 6 in 10 newly indexed web pages now contain primarily synthetic text or images. Imperva's 2025 Bad Bot Report put bot traffic at 51% of all web traffic, crossing the majority threshold for the first time in a decade. Bad bots alone accounted for 32% of all internet traffic.
These aren't fringe estimates from AI doomers. They're from companies whose entire business model depends on getting the measurement right.
The practical effect for content marketers: the channel you've been publishing to (the open web, distributed through Google) is getting noisier at a rate that makes your individual contribution harder to distinguish. Not impossible. Harder. And in marketing, "harder" usually just means "more expensive for the same result."
Your audience can tell, and they're starting to punish it
Audience enthusiasm for AI-generated content dropped from 60% in 2023 to 26% in 2025. Not a gradual decline. A cliff.
A separate study from SmythOS found that 73% of consumers say they can identify AI-generated marketing content. Whether they're actually right about what's AI and what isn't is almost beside the point. The perception itself creates a trust penalty. When people believe content is AI-generated, they engage with it less, trust it less, share it less.
Nielsen's data still shows user-generated content is trusted roughly 2x more than brand messaging. I suspect that gap is widening now, though I haven't seen updated numbers that isolate the AI variable specifically. From what I've seen in industry Slack channels and Reddit threads, the skepticism is running hot. People are pattern-matching on certain phrases, certain structures, certain rhythms that feel synthetic. "Delve" became a punchline. Scientific American reported that ChatGPT's favorite vocabulary started showing up more frequently in conversational podcasts, which tells you something about how deep the linguistic contamination goes.
And honestly, "contamination" isn't quite the right word. It implies the AI content is hidden or subtle. A lot of it isn't. It's just obvious. And boring. And everywhere.
The 4.1x gap that should worry content teams
So here's the part that should actually inform how you spend your content budget. AI content with human strategic oversight performs 4.1x better than fully automated output, based on research tracking content performance across hybrid and fully automated workflows. Think about that multiplier for a second. The same topic, the same keywords, the same publish cadence, but 4x the performance because a human actually shaped the final output.
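To make the multiplier concrete, here's a back-of-the-envelope sketch. The dollar figures and per-piece baselines below are illustrative assumptions I'm making up for the example, not numbers from the research; the only input taken from the study is the 4.1x multiplier itself.

```python
# Back-of-the-envelope cost-per-result comparison between a fully
# automated workflow and a hybrid (human-oversight) workflow.
# All costs and baseline results are illustrative assumptions.

def cost_per_result(cost_per_piece: float, results_per_piece: float) -> float:
    """Dollars spent for each unit of performance (e.g., one qualified visit)."""
    return cost_per_piece / results_per_piece

auto_results = 10                    # assumed results per fully automated piece
hybrid_results = auto_results * 4.1  # the 4.1x multiplier from the research

# Assume human editing adds 50% to the per-piece production cost.
auto_cost = cost_per_result(cost_per_piece=200, results_per_piece=auto_results)
hybrid_cost = cost_per_result(cost_per_piece=300, results_per_piece=hybrid_results)

print(f"automated: ${auto_cost:.2f} per result")   # $20.00
print(f"hybrid:    ${hybrid_cost:.2f} per result")  # $7.32
```

Even with a hefty premium for editorial time baked in, the hybrid workflow comes out well under half the cost per result in this toy model. That's the shape of the argument: the multiplier swamps the added labor cost.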
Most of the debate I follow (on Reddit, not in consulting calls, because I'm going to be honest about where my sample comes from) breaks into two camps. Either teams went full autopilot and published 10x more content last year, or they kept their human editorial process mostly intact and used AI to speed up research and first drafts. The second group seems to be doing better. But the measurement is messy because the teams in camp one rarely have the analytics infrastructure to know they're underperforming.
Only 19% of content marketers track AI-specific KPIs, according to a 2026 study. So 81% of teams using AI in their content pipeline have no structured way to know whether it's helping or hurting. Most of that is a management gap, not a tooling one.
The channel that doesn't have a dead internet problem
This is probably a good moment to point out what isn't getting flooded with synthetic content: your email list. Your subscriber base. Your first-party audience.
Email still has the highest ROI of any marketing channel. (That's a standing NMS opinion, but it also lines up with basically every benchmark study published in the last five years.) And unlike organic search, where you're competing with an exponentially growing volume of AI-generated pages for a fixed number of ranking slots, email lands in an inbox. The competition there is attention, not algorithmic filtering.
The dead internet problem is fundamentally a distribution problem. If your content strategy depends on the open web discovering your article through Google, you're fighting against a 17% (and growing) synthetic content baseline in the results, plus AI Overviews that reduce clicks to top organic results by 34.5%. If your content strategy delivers directly to an audience that opted in, you've sidestepped the entire issue.
I'm not saying SEO is dead. There are still real opportunities in organic, especially for highly specific, expert-level content that AI generators produce poorly. But the economics are shifting. The cost of winning an organic position is going up while the value of that position is going down. If you're not building a first-party audience alongside your search strategy, you're building on ground that shifts a little more every quarter.
The budget rebalance most teams are putting off
First, audit your content pipeline for the trust penalty. Pull your last 90 days of published content. How much of it was AI-generated with minimal human editing? Check the engagement metrics on those pieces versus your human-edited work. If you're not tracking that distinction yet, start now. The 4.1x performance gap should show up in your own data if you look for it.
Second, make authorship visible. Real names, real titles, real editorial perspectives. This is one of the few trust signals that's genuinely hard to fake at scale. MarketScale's analysis suggests provenance systems (visible authorship, AI-disclosure policies, contributor programs) are becoming table-stakes for brands that want to maintain credibility. If your blog posts say "by [Brand Name]" with no human attached, you're already at a disadvantage.
Third, accelerate your owned-audience investment. Newsletter subscribers, community members, direct relationships. The 31.4% of marketers reporting their biggest performance decline in organic search and SEO aren't imagining things. The channel is getting harder, and it's getting harder because of a structural shift, not a temporary algorithm fluctuation. Own the distribution and the dead internet becomes somebody else's problem.
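The audit in the first step doesn't require special tooling; an export of your last 90 days of content and a few lines of scripting will surface the gap if it exists. Here's a minimal sketch. The field names, workflow labels, and engagement numbers are hypothetical placeholders, not a real analytics schema:

```python
# Group the last 90 days of published pieces by workflow and compare
# average engagement. Fields and figures below are hypothetical.
from statistics import mean

pieces = [
    {"title": "Post A", "workflow": "ai_only",      "engaged_sessions": 120},
    {"title": "Post B", "workflow": "human_edited", "engaged_sessions": 540},
    {"title": "Post C", "workflow": "ai_only",      "engaged_sessions": 90},
    {"title": "Post D", "workflow": "human_edited", "engaged_sessions": 410},
]

def avg_engagement(rows, workflow):
    """Mean engaged sessions for all pieces produced under one workflow."""
    return mean(r["engaged_sessions"] for r in rows if r["workflow"] == workflow)

ai = avg_engagement(pieces, "ai_only")          # 105.0
human = avg_engagement(pieces, "human_edited")  # 475.0
print(f"human-edited outperforms by {human / ai:.1f}x")
```

The point isn't the script; it's the tagging. If your CMS doesn't record which workflow produced each piece, start recording it now, because you can't run this comparison retroactively.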
Altman probably won't fix this
The funniest part of Altman's tweet is that the internet's primary response was the Tim Robinson hot dog suit meme: a man in a costume standing in front of a wrecked car asking "we're all trying to find the guy who did this." It was deployed so many times in the replies that it should probably count as a primary source at this point.
But whether Altman feels sincere remorse or is performing concern for the cameras doesn't change the conditions on the ground. Bots are the majority of web traffic. Most new pages are synthetic. Consumer trust in AI-generated content is actively declining. And the teams that treat content as a volume play (just publish more, because publishing is nearly free) are going to discover that nearly free content produces nearly zero results.
I think the real risk isn't that the internet "dies" in some dramatic way. It's that the version of the internet where content marketing works as a reliable growth channel just keeps getting smaller. By the end of this year, I'd estimate at least a quarter of search-dependent content programs will report negative ROI once you factor in the editorial oversight they skipped. The dead internet isn't really about whether the internet is dead. It's about whether your content is distinguishable from everything a bot published this morning. For a lot of teams right now, honestly, it isn't.