I've been staring at search results lately and something feels off. The pieces all work, but the whole thing feels hollow. Like walking into a store where everything's stocked perfectly but nothing's actually for sale. We're in February 2026 and the numbers are worse than most people realize. When AI Overviews show up in search results, 80-83% of searches end without anyone clicking anything. You type something in, get an AI-generated summary at the top, and bounce. Google calls them "AI Overviews," but what they really are is the final nail in the coffin of the open web.
Here's the thing people keep dancing around: the internet's mostly fake now. Over half of all English-language content online is AI-generated, and as of April 2025, 74% of newly published web pages contained AI content. Merriam-Webster named "AI slop" their Word of the Year for 2025, defining it as "digital content of substandard quality typically generated in large volumes through artificial intelligence," because it's everywhere. Pinterest and YouTube had to add filters just so people could block AI content because users were revolting. Security firm Imperva found that bot traffic crossed 51% in 2024. That means more than half of internet activity is automated, and that was before AI generation hit mainstream scale.
The way people search has completely flipped too. It used to be "how do I fix this" or "where can I learn about that." Now it's "do this for me." Voice search, visual search through Google Lens hitting 10 billion searches a month, conversational queries where you're basically talking to an assistant. 71% of people prefer voice search because typing feels like too much work. We've gone from seeking guidance to demanding execution. The shift is wild: 31.6% of AI-triggered queries are question-based, and queries with 4+ words are way more likely to trigger AI answers that keep you locked in place. Google retrained user behavior so thoroughly that even searches without AI Overviews saw a 41% click-through rate drop year-over-year. People just expect instant answers now.
And the damage is real. When AI Overviews appear, organic click-through rates collapse by 58-61%. Paid ads? Down 68%. On mobile the zero-click rate hits 77%. The New York Times saw organic search traffic drop from 44% to 36.5% over three years. Some publishers report losing 20-90% of traffic in the past year. Companies like Chegg are literally suing Google, claiming Google coerces publishers into surrendering content for AI training purposes as a condition for search inclusion, then uses that content to destroy their traffic. The lawsuit filed in February 2025 basically says "give us your stuff for free or disappear from search entirely." This goes beyond some abstract SEO problem. The entire incentive structure of the web is breaking down. Why create good content if nobody sees it?
The authenticity crisis runs deeper. Dead Internet Theory used to be a conspiracy thing, this idea that most online activity is bots and corporate algorithms instead of humans. Now it's just reality. Synthetic media is indistinguishable from real stuff at scale. AI agents mimic tone and emotion convincingly enough that you genuinely wonder who's real. Traditional verification systems are failing. Trust in news, institutions, even peer-to-peer communication is collapsing because we've lost the ability to authenticate anything. You see a photo, read an article, watch a video, and your first thought is "is this even real?" That's the default now.
But here's what actually bothers me: it's all the same. AI-generated content goes beyond fake. It's generic. Homogenized. Every article sounds like every other article because they're all trained on the same data, optimized for the same engagement metrics. Platforms prioritize virality over authenticity. The algorithm feeds you what it thinks you want based on what worked for someone else, and it all blurs into this beige sludge of information that technically answers your question but leaves you empty. We've traded diversity of thought for scalable content production, and the internet's becoming an echo chamber where every voice sounds like the same robot. When everything's trained on the same corpus, optimized for the same metrics, and generated by the same models, you get this weird monoculture where nothing feels original anymore.
The survival pattern tells you everything you need to know. Differentiated, branded content survives while generic SEO farms die. Publishers like Dotdash Meredith and Ziff Davis claim minimal impact because they built actual brands people trust. Meanwhile, content farms that spent 20 years optimizing for Google's algorithm are getting obliterated. We built this ourselves. Publishers created mountains of SEO-optimized garbage designed to rank instead of inform. Google trained users to expect instant answers instead of exploration. AI slop is just the logical endpoint of "content as SEO vehicle" meeting "infinite scalability."
So where does that leave publishers and creators? Trying to adapt to a game where the rules changed overnight. Only 1% of users click links cited in Google's AI Overviews. ChatGPT's 1.2 billion referrals between September and November 2025 still only account for 1% of total publisher traffic. The math doesn't work: AI referrals come nowhere near replacing what search used to send.
Some are trying to optimize for AI citations instead of clicks. Getting cited inside the AI Overview becomes the new game. Structure content so AI can extract and attribute it: answer upfront in the first paragraph, clean HTML structure with FAQs and lists, schema markup to help AI categorize your content, topic clusters instead of exact-match keywords. The thinking shifts from driving traffic to gaining visibility. Think billboard advertising instead of direct response. You're building brand recognition even if users never click.
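To make "schema markup" concrete, here's roughly what that looks like in practice: a JSON-LD block using schema.org's FAQPage type, embedded in the page head. This is a sketch, not a guarantee of citation; the question and answer below are placeholders, and which properties any given AI system actually reads is its own black box.

```html
<!-- FAQPage structured data (JSON-LD), the kind of markup that helps
     AI systems extract and attribute an answer. Placeholder content. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Do AI Overviews reduce organic clicks?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "When AI Overviews appear, organic click-through rates drop sharply, which is why publishers now optimize for being cited inside the Overview rather than for the click itself."
    }
  }]
}
</script>
```

The same "answer upfront" logic applies to the visible page: lead with the direct answer, then elaborate, so an extractor grabs the right sentence.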
Others are diversifying away from search entirely. Newsletters, subscriptions, apps, anything that builds owned audiences. Some publishers signed content licensing deals with AI companies. The Atlantic, News Corp, Washington Post partnered with OpenAI. The New York Times signed with Amazon. You get upfront payments plus royalties for AI training usage. A few publishers are blocking AI crawlers entirely, choosing to disappear from AI results rather than let their content get used for free.
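Blocking AI crawlers, for the publishers going that route, is mostly a robots.txt exercise. The user-agent tokens below are the publicly documented ones for OpenAI, Google's AI-training opt-out, and Common Crawl; note that honoring robots.txt is voluntary, so this is a polite fence, not a wall.

```text
# robots.txt — opt out of AI training crawlers while staying in search.
# Comments after "#" are allowed per RFC 9309. Compliance is voluntary.

User-agent: GPTBot            # OpenAI's training crawler
Disallow: /

User-agent: Google-Extended   # Google's AI-training opt-out token
Disallow: /

User-agent: CCBot             # Common Crawl, widely used as training data
Disallow: /

# Everyone else (including regular search indexing) stays allowed:
User-agent: *
Allow: /
```

The catch the essay already implies: Google-Extended controls AI training, not AI Overviews, so blocking crawlers doesn't get you out of the Overview problem, just out of the training corpus.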
The publishers surviving this are rebuilding business models that work without Google. Licensing deals, subscriptions, owned platforms. The open web traffic model feels cooked. Traditional SEO tactics like keyword density, backlink volume, classic on-page optimization become irrelevant when users never leave Google. Quality content alone means nothing if nobody sees it.
Some are going legal. The Independent Publishers Alliance filed EU complaints demanding transparency on AI content usage and impact assessments. Regulatory intervention might force change, but that's a 2-3 year timeline minimum. Meanwhile, Google's roadmap includes more AI features: AI Mode international expansion, voice-activated queries, multi-turn conversations. The trend accelerates instead of reversing.
So where does that leave us? Stuck between a search engine that keeps you trapped and a web full of content nobody wrote. The tools got better but the experience got worse. We're asking more questions than ever and getting fewer real answers. The anti-slop sentiment is real. The frustration's building. But the fix probably comes from users migrating to closed communities, newsletters, Discord servers, and spaces where provenance matters. The open web might just stay dead while smaller trusted networks flourish. That's already happening. You see it in the rise of Substacks, private Slacks, and curated feeds over algorithmic ones.
The question goes beyond whether the internet needs to change. It's whether enough people care to build something different, or whether we just accept this as how things work now.