SEO attribution in 2025: Why it’s broken and what you can do

For years, SEO professionals have worked hard to earn a seat at the table. 

We’ve built dashboards, reported wins, and tried to translate complex user journeys into something business leaders could understand. 

We’ve tried – sometimes desperately – to connect the dots between what we do and the results the business sees.

Now those dots are disappearing.

Attribution has never been easy in SEO. 

And today, it’s becoming nearly impossible. 

Between AI-generated answers, completely changed user behavior, vanishing click-throughs, and broken analytics, we’re entering a phase where we know SEO’s influence is real – we just can’t prove it.

But what’s making attribution worse than ever, and what can SEO teams do to survive? Keep reading.

Attribution has always been hard

Attributing SEO efforts to actual results has never been straightforward. 

SEO is a long-term game – you don’t make a change and see impact the next day. 

It often takes weeks or even months for results to show.

By then, so many other things may have changed that it’s hard to know what really moved the needle.

We’ve always been in a state of guesstimation.

One of the biggest issues is isolating SEO from everything else happening at the same time. 

It’s rarely one thing that drives a traffic spike or performance drop. 

It could be:

  • An algorithm update.
  • A holiday.
  • A product launch.
  • Changes to the site you didn’t even know about.

In larger companies with many stakeholders, you can’t always control – or even track – what’s being changed. 

One of the first things I experienced in my current role was a homepage headline update I didn’t even notice. 

I was new and focused on catching up, unaware it had happened.

But constant change is just part of the problem.

More users are using ad blockers or opting out of cookies, making it harder to connect the dots. 

Then there’s Google. 

Anyone who’s tried to segment data in Google Search Console knows the numbers don’t always add up. 

Try filtering by page type or section, and the sum of the parts doesn’t equal the whole.

To make things worse, there’s often confusion around what drives organic traffic. 

A PR campaign or out-of-home ad might coincide with a spike, but that doesn’t mean it caused it. 

These awareness plays benefit all channels, not just organic.

The reason is simple: users don’t care if a result is organic or paid. They click whatever best matches their intent.

Between tech limitations, shifting user behavior, and internal coordination challenges, gaining a clear view of SEO’s real contribution has always been difficult, and it’s only getting harder.

Dig deeper: How to measure SEO success when AI is changing search

AI search has changed the user journey

Things have always been hard when it comes to SEO attribution. 

However, over the last year or so, it has reached a whole new level of uncertainty, largely due to the way people are now utilizing search engines and AI tools.

The old user journey used to be fairly straightforward:

  • Query → SERP → Click → Website → Conversion

Now, it looks more like this:

  • Query → AI reasoning → AI answer → Query → AI reasoning → AI answer → Decision made → (Maybe) Brand search → (Maybe) Website visit → (Maybe) Conversion

With AI Overviews built into Google, users don’t even need to open a separate AI app, let alone visit a website, to find the answer to their question. 

The summary is right there, above the search results – and often, it’s “good enough.” 

That single shift already reduces the chance of a click, even when your content is technically ranking.

And that’s just Google. 

For product research or more complex queries, people are increasingly using tools like ChatGPT, Perplexity, or Gemini to get curated responses. 

In that environment, both organic and paid traffic lose out, not only because your brand might not be relevant but also because the user might never even see your link.

Some will argue that this isn’t entirely new. 

Google has been offering information-rich features for years.

Knowledge panels, featured snippets, and other SERP enhancements have already reduced the need to click. 

However, in many markets – especially across Europe – we haven’t been exposed to the full range of these features until now. 

Because of GDPR and restrictions around personalization, many of the SERP features that shaped user behavior elsewhere weren’t launched at all. 

Now, AI Overviews are everywhere, and this time, their impact is more severe. 

The answers are longer, more complete, and much more likely to satisfy the user right there in the search results. 

And soon, AI Mode will take this even further, layering on additional information, including details specific to you. 

To make things even worse, we’re losing visibility into how users make decisions. 

If your brand shows up in an AI-generated response, great. 

But often, users don’t click straight away. They might come back later and search for your brand. 

And when that happens, we don’t know what triggered their interest in the first place. 

Even worse, I’ve seen AI answers mentioning one brand but citing a competitor’s source. Just search for “Best XYZ” and you’ll see what I mean.

The worst part is that most of us rely on almost the same toolset with almost the same features. 

For years, we’ve worked around Google’s limitations – “not provided” keywords, missing data in GA4 – but at least we had some visibility. 

Now, most analytics tools don’t capture what happens in AI environments, and those that try to offer some insight are often too expensive. 

You can’t go to your manager and say, “Give me a few extra thousand dollars a month so I can maybe figure out how much traffic we’re not getting and potentially how to get back some part of it.” 

Most of us need to start small and hope for the best.

Paid search is going through similar challenges, but there’s a key difference. 

Platforms like Google have a direct financial incentive to solve attribution for ads. 

We’re already seeing experiments like sponsored slots in AI Overviews, and you can bet there are more to come.

For organic? Not so much. 

No one is rushing to build robust attribution pipelines for free traffic. 

And that means SEO teams are flying blind.

Dig deeper: Will Google’s AI Overviews kill the click?

Rebuilding SEO attribution: What to track when clicks disappear

What’s happening right now feels like a step back.

SEOs have worked hard to make the channel more measurable, to connect:

  • Actions with outcomes.
  • Traffic with revenue.
  • Organic growth with real business value. 

And just when we thought we were making progress, the SEO armageddon arrived.

Now we’re told to:

  • Focus on “visibility” without traffic. 
  • Look at “influence” without attribution.
  • “Trust the long-term value.” 

It’s not that these things are untrue.

They’re just incredibly hard to sell to stakeholders who are still asking for dashboards and ROAS.

I miss the days when, after optimizing a set of meta titles, we could expect the average position and traffic to increase… It was so easy to explain.

But let’s try to think of the glass as half full and see how we can adapt.

Before we discuss adapting, it’s worth noting that if you want to connect SEO to results, you need to make sure you’re tracking the basics properly.

Traffic from AI platforms like Perplexity, ChatGPT, or Claude often comes through as referral in analytics. 

This is technically accurate, but easy to lose track of once mixed with other sources. 

To avoid this, it’s a good idea to create a custom channel grouping in your analytics platform. 

This way, all AI-related referral traffic can be tracked in one place. 

A regex filter for this might look something like:

.*gpt.*|.*chatgpt.*|.*openai.*|.*perplexity.*|.*google.bard.*|.*bard.google.*|.*bard.*edgeservices.*|.*gemini.google.*|.*gemini.*|.*copilot.*|.*claude.*|.*anthropic.*|.*deepseek.*|.*grok.*|.*qwant.*|.*mistral.ai.*

This ensures you can track AI referrals separately and spot early trends – even if click numbers stay small.
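
If you want to sanity-check what that filter will catch before setting up the grouping (the grouping itself is configured in your analytics admin interface), here is a minimal Python sketch that applies the same pattern to a few example referral sources. The sample hostnames are purely illustrative:

import re

# The same pattern suggested above, reused to post-process an analytics
# export (e.g., a CSV of session sources) outside of the analytics UI.
AI_REFERRAL_PATTERN = re.compile(
    r".*gpt.*|.*chatgpt.*|.*openai.*|.*perplexity.*|.*google.bard.*|"
    r".*bard.google.*|.*bard.*edgeservices.*|.*gemini.google.*|.*gemini.*|"
    r".*copilot.*|.*claude.*|.*anthropic.*|.*deepseek.*|.*grok.*|"
    r".*qwant.*|.*mistral.ai.*",
    re.IGNORECASE,
)

def is_ai_referral(source: str) -> bool:
    """Return True if a session source/referrer looks like an AI platform."""
    return bool(AI_REFERRAL_PATTERN.fullmatch(source.strip()))

# Example sources as they typically appear in a referral report.
sample_sources = [
    "chatgpt.com",
    "perplexity.ai",
    "copilot.microsoft.com",
    "gemini.google.com",
    "news.ycombinator.com",  # a regular referral, should not match
]

for source in sample_sources:
    label = "AI referral" if is_ai_referral(source) else "Other referral"
    print(f"{source:28} -> {label}")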

Also, make sure to do the same in your CRM setup so that any leads or conversions coming from these sessions are properly tagged. Otherwise, SEO loses the credit again.

With the basics covered, let’s see if there is some light in the darkness.

While we may not get direct attribution, there are ways to track engagement and see whether our content – and the effort we put into optimizing it – is doing its job. 

Some of the signals we can look for are:

  • Time on site, engagement rate, and conversion rate (benchmarked against our own content – see the sketch after this list).
  • Underperforming pages that fall below engagement baselines.
  • Feedback from sales or support about how users found the brand.
  • Growth in branded queries or product feature searches.
  • Users progressing across content in logical, funnel-like paths.
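
For the first two signals, here is a rough sketch of what benchmarking against your own content could look like, assuming you have exported per-page organic metrics to a CSV. The file name, column names, traffic cutoff, and the 80% threshold are all placeholders to tune to your own data:

import pandas as pd

# Hypothetical export of per-page organic metrics (e.g., from GA4 or a BI tool).
# Columns assumed: page, sessions, engagement_rate, conversion_rate.
pages = pd.read_csv("organic_landing_pages.csv")

# Benchmark against our own content: the median engagement rate across
# organic landing pages with meaningful traffic.
meaningful = pages[pages["sessions"] >= 100]
baseline = meaningful["engagement_rate"].median()

# Flag pages that fall well below that internal baseline.
meaningful = meaningful.assign(
    vs_baseline=meaningful["engagement_rate"] - baseline,
    underperforming=meaningful["engagement_rate"] < 0.8 * baseline,
)

print(f"Site-wide engagement baseline: {baseline:.2%}")
print(
    meaningful[meaningful["underperforming"]]
    .sort_values("vs_baseline")[["page", "sessions", "engagement_rate", "vs_baseline"]]
    .head(20)
    .to_string(index=False)
)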

There are also some new metrics we should start keeping an eye on, measuring, and reporting.

1. Track AI visibility – not rankings and traffic

You might not get the click, but you can still show up in AI answers. 

If your content is clean, factual, structured, and has unique information, LLMs are more likely to use it. 

The reason is that they try to offer a wide range of information to the user. 

And if you’re one of the few offering something unique, you’ll probably make it into the AI results. 

If you’re a fan of Eli Schwartz’s product-led SEO concept, you’re probably already doing something similar.

Proving it? 

Best-case scenario, you track brand search lifts or use a tool that tracks a limited number of prompts. 

You can adjust your web forms by adding the question, “Where did you hear about us?”

If you work with a sales team, they can ask this question. 
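
One way to approximate that brand search lift is to compare branded impressions across two Google Search Console query exports for comparable periods. A minimal sketch, where the file names and brand terms are placeholders:

import pandas as pd

# Hypothetical GSC query exports for two comparable periods
# (e.g., last 28 days vs. the 28 days before).
# Columns assumed: query, impressions, clicks.
current = pd.read_csv("gsc_queries_current.csv")
previous = pd.read_csv("gsc_queries_previous.csv")

BRAND_TERMS = ("acme", "acme shoes")  # placeholder brand variations

def branded_impressions(df: pd.DataFrame) -> int:
    """Sum impressions for queries containing any branded term."""
    mask = df["query"].str.lower().str.contains("|".join(BRAND_TERMS))
    return int(df.loc[mask, "impressions"].sum())

now, before = branded_impressions(current), branded_impressions(previous)
lift = (now - before) / before if before else float("nan")
print(f"Branded impressions: {before} -> {now} ({lift:+.1%} change)")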

Dig deeper: How to track visibility across AI platforms

2. Shift from keywords to search intent signals

Keywords don’t work the way they used to. 

Users don’t search linearly anymore – they ask AI tools. 

If your content doesn’t fit into that flow, you’re out.

Proving it? 

Maybe more branded queries that look vaguely like your page titles. 

Or longer, more conversational phrases showing up in your Google Search Console data. 
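
A rough way to surface both signals from a Google Search Console query export. The file name, column names, word-count threshold, and similarity cutoff are assumptions you would tune to your own data:

import difflib
import pandas as pd

# Hypothetical GSC query export with columns: query, impressions.
queries = pd.read_csv("gsc_queries_current.csv")

# A few of your own page titles to compare queries against (placeholders).
page_titles = [
    "how to choose trail running shoes",
    "acme waterproof hiking boots",
]

def best_title_match(query: str) -> float:
    """Return the highest fuzzy-match ratio between a query and any page title."""
    return max(
        difflib.SequenceMatcher(None, query.lower(), title).ratio()
        for title in page_titles
    )

queries = queries.assign(
    word_count=queries["query"].str.split().str.len(),
    title_similarity=queries["query"].map(best_title_match),
)

# Longer, conversational phrases and queries that echo your titles.
long_phrases = queries[queries["word_count"] >= 6]
title_like = queries[queries["title_similarity"] >= 0.6]

print(long_phrases.nlargest(10, "impressions")[["query", "impressions"]])
print(title_like.nlargest(10, "title_similarity")[["query", "title_similarity"]])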

3. Monitor AI mentions across tools and touchpoints

I am mentioning this as a separate item, mainly because many of the well-known SEO tools already show this information (for example, Semrush). 

But also because, according to Google, appearing in AI Overviews is reflected in your Google Search Console data. 

Unfortunately, there isn’t a filter to isolate these results specifically. 

Proving it? 

Your best bet is combining an external tool that shows AI Overviews results with Google Search Console impression data. 
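
A small sketch of that combination: join the queries where an external tool detected you in an AI Overview with your Google Search Console query export, then compare impressions and CTR for the two groups. The file names and column names are placeholders:

import pandas as pd

# Hypothetical exports: queries where a third-party tool detected your site
# in an AI Overview, and a GSC query export with impressions and clicks.
aio = pd.read_csv("ai_overview_mentions.csv")   # column assumed: query
gsc = pd.read_csv("gsc_queries_current.csv")    # columns assumed: query, impressions, clicks

merged = gsc.merge(aio.assign(in_ai_overview=True), on="query", how="left")
merged["in_ai_overview"] = merged["in_ai_overview"].fillna(False)

summary = merged.groupby("in_ai_overview")[["impressions", "clicks"]].sum()
summary["ctr"] = summary["clicks"] / summary["impressions"]
print(summary)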

Dig deeper: 12 new KPIs for the generative AI search era

Additional tip

Keep in mind all the new things Google and other AI tools are launching for paid traffic. 

Remember, they need to prove that paying for ads is still meaningful, and it is highly probable that we will soon see some changes in this direction.

Final thoughts

SEO has never been easy. 

But at least we used to feel like we were learning the rules of the game – slow results, sure, but measurable ones. 

Now it feels like someone changed the game, moved the goalposts, and then set fire to the rulebook.

We’re being asked to navigate a system that’s less visible, less trackable, and far less accountable. That’s frustrating. Both for us and our stakeholders.

But people still search. 

People still discover. 

People still make decisions online. 

Our job now is to keep showing up even if the results are harder to trace, defend, and explain.

That means developing a new measurement mindset, one that’s less focused on direct attribution and more on user momentum. 

If your SEO is helping people move forward – discovering, engaging, and showing up in sales conversations – then it’s working. 

You just need to adjust how you track it.

Weighted attribution models, user engagement metrics, and anecdotal signals from internal teams might not be perfect, but they can be enough to justify the work and shape what comes next.

Dig deeper: Want to beat AI Overviews? Produce unmistakably human content