Google's AI Overviews Are Killing Publisher Traffic. Now What?


The numbers are worse than anyone publicly admits.

I’ve spoken to half a dozen Australian publishers in the past month, and every single one reports the same thing: Google traffic is cratering. Not declining gradually—dropping off a cliff. One mid-sized news site showed me analytics indicating a 34% drop in Google referrals since AI Overviews expanded last year.

This isn’t the first time Google has hurt publishers. But it might be the most significant.

What’s Actually Happening

Google’s AI Overviews synthesize information from multiple sources and present answers directly in search results. Users get what they need without clicking through to publisher sites.

For informational queries—the kind news publishers depend on—this is devastating. “What happened in the bushfire?” gets answered right there in the search results. Why click through?

The click-through rate on searches with AI Overviews is, according to multiple studies, roughly half what it was for traditional search results. And AI Overviews now appear on a growing percentage of queries.

Publishers are cited in these overviews. Sometimes. But a citation without a click generates no ad revenue, no subscriber conversion opportunity, no direct relationship with the reader.

The Traffic Breakdown

Not all traffic is affected equally. Here’s what I’m seeing:

Most impacted:

  • Breaking news summaries (AI can synthesize from multiple sources)
  • Explainer content (answering “what is” queries)
  • List-based articles (easily synthesized)
  • Service journalism (recipes, how-tos, guides)

Less impacted:

  • Deep investigative work (can’t be summarized easily)
  • Opinion and analysis (people want the specific voice)
  • Local coverage (often not enough sources for AI to synthesize)
  • Exclusive reporting (only one source exists)

The irony is stark: the commodity content that pays the bills is disappearing, while the expensive journalism that doesn’t generate as much traffic is more defensible.

Publisher Responses

Different publishers are responding differently. Here’s what I’m seeing:

The legal route. Some are joining lawsuits or advocating for regulatory intervention. The argument: Google is appropriating content without fair compensation. This may eventually succeed, but the timeline is years, not months.

The partnership route. Others are negotiating directly with Google and other AI companies. The deals vary widely—some reportedly generous, others insultingly small. Perplexity, OpenAI, and others are also making partnership offers.

The technical route. Some publishers are blocking AI crawlers, using robots.txt to prevent their content from appearing in AI training or synthesis. The problem: this may hurt more than it helps if it reduces visibility entirely.
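For publishers weighing the technical route, blocking usually starts with robots.txt directives aimed at the crawler tokens the AI companies publish. A sketch of the common pattern is below — the user-agent names are the ones documented as of this writing (GPTBot for OpenAI, Google-Extended for Gemini training, CCBot for Common Crawl), but new bots appear regularly and not all of them honor robots.txt, so treat this as illustrative rather than exhaustive:

```text
# robots.txt — opt out of AI training crawlers while staying in web search.

User-agent: GPTBot            # OpenAI's training crawler
Disallow: /

User-agent: Google-Extended   # Opts out of Gemini training only — it does
Disallow: /                   # NOT remove you from Search or AI Overviews,
                              # which are built on the normal Googlebot index

User-agent: CCBot             # Common Crawl, widely used in training sets
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: *                 # Everything else, including Googlebot,
Allow: /                      # continues to crawl normally
```

Note the Google-Extended caveat: it is precisely why blocking may not help with the AI Overviews problem specifically — the overviews are a search feature, and opting out of search is the visibility trade-off described above.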

The adaptation route. Others are accepting the change and pivoting strategy—focusing on content AI can’t easily replicate, building direct audience relationships, reducing dependence on search traffic.

The Strategic Questions

If you’re a media leader, you’re facing difficult choices:

Should you block AI crawlers?

The argument for: don’t help train systems that undermine your business.

The argument against: visibility matters; being invisible to AI may be worse than being cited without clicks.

I’ve talked to strategists who advise complete blocking and others who advocate maximum visibility. The honest answer is nobody knows which approach is right yet.

Should you pursue AI partnerships?

The deals being offered vary enormously. Some publishers report offers that are “insulting”—five or six figures annually for content worth millions. Others have negotiated meaningful compensation.

My view: negotiate hard, but negotiate. Being left out entirely may be worse than taking imperfect deals. The publishers with the most leverage are those with unique content that AI systems need.

Should you restructure content strategy?

Almost certainly yes. The content that’s being commoditized by AI should decline in investment. The content that remains defensible—original reporting, unique perspectives, community connection—should increase.

This is easier said than done. Service journalism generates traffic; reducing it affects near-term revenue. But continuing to invest in content that AI is consuming feels increasingly irrational.

What I’m Advising

When publishers ask me what to do, I offer this framework:

Short term: Monitor carefully. Track which content categories are most affected. Don’t make dramatic changes based on incomplete data.

Medium term: Diversify traffic sources aggressively. Newsletter subscribers, app users, direct visitors—these matter more than ever. Search dependency is a vulnerability.

Long term: Invest in defensible content. Original reporting. Local presence. Expert voices. Content that can’t be synthesized because it doesn’t exist elsewhere.

The publishers who’ll survive aren’t those who fight the change most effectively. They’re those who adapt fastest to a world where search traffic is no longer reliable.

The Technology Response

Some publishers are exploring technical approaches to make their content more AI-resistant or more visible.

Approaches I’ve seen include:

  • Structured data optimization for AI citation
  • Content gating that requires clicks for full information
  • Paywall strategies that limit what AI can access
  • Partnerships with AI strategy support firms to develop custom technical solutions
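On the first and third points: "structured data optimization" in practice usually means schema.org markup embedded as JSON-LD, which makes authorship, provenance, and paywall status machine-readable. The `isAccessibleForFree` / `hasPart` pattern is Google's documented way to flag paywalled sections without being treated as cloaking. A minimal sketch for a news article — the URLs and names are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example headline",
  "datePublished": "2025-03-01T08:00:00+11:00",
  "author": {
    "@type": "Person",
    "name": "Jane Reporter",
    "url": "https://example.com.au/staff/jane-reporter"
  },
  "publisher": {
    "@type": "NewsMediaOrganization",
    "name": "Example News"
  },
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".paywalled"
  }
}
</script>
```

Whether richer markup actually earns more prominent AI citations is an open question, but it is one of the few levers publishers control directly.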

None of these are silver bullets. But publishers who understand the technology—not just the editorial—are better positioned to respond.

The Broader Implications

What’s happening with Google is a preview of what’s coming across the information ecosystem. AI systems that can synthesize information change the value of information itself.

If information can be assembled, summarized, and presented without visiting sources, the sources become invisible infrastructure rather than destinations. Journalism becomes a substrate rather than a product.

This has profound implications for the economics of news production. Someone needs to fund the original reporting that AI systems depend on. If that funding evaporates because attention moves upstream, the quality of information deteriorates.

Google, to its credit, seems to understand this—hence the licensing deals and publisher partnerships. But whether those arrangements adequately compensate for lost traffic remains unclear.

Looking Forward

The publishers I’m most optimistic about share certain characteristics:

  • Diversified revenue: Not dependent on any single traffic source or monetization model
  • Direct relationships: Large owned audiences via email, apps, direct visits
  • Distinctive content: Work that can’t be synthesized because it doesn’t exist elsewhere
  • Technical sophistication: Understanding of AI systems and how to work with or around them

Media organizations that understand how AI systems work—whether through internal expertise or partnership with specialists in this space—have significant advantages in navigating these changes.

Google’s AI Overviews aren’t going away. They’ll get better, appear more often, and answer more queries. Publishers who accept this reality and adapt accordingly will survive. Those who don’t—won’t.

The traffic decline is real. The question is whether you respond with denial or strategy.


How is your organization responding to declining search traffic? I’m collecting case studies for a longer piece on publisher adaptation strategies.