The slow erosion of my trust in Google Search Console

Is this the start of the end for one of my favourite data sources?

Google Search Console has long been one of my preferred sources of data. 

It has felt like the closest thing we get to a “source of truth” from Google itself, and I have always preferred its concept of ‘average rank’ for measuring visibility in the SERPs over relying on ranking reports.
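For anyone unfamiliar with the metric, ‘average rank’ (or average position) is not a single rank but an average across all the impressions a query earned. Roughly speaking, it behaves like an impression-weighted average, which the following minimal sketch illustrates (the field names and numbers are invented for illustration, not the actual Search Console API schema):

```python
# Illustrative per-query rows, as you might assemble them from exported data.
rows = [
    {"query": "example term a", "position": 2.0, "impressions": 500},
    {"query": "example term b", "position": 8.0, "impressions": 100},
    {"query": "example term c", "position": 15.0, "impressions": 400},
]

def average_position(rows):
    """Impression-weighted average position across a set of query rows."""
    total_impressions = sum(r["impressions"] for r in rows)
    if total_impressions == 0:
        return None
    weighted = sum(r["position"] * r["impressions"] for r in rows)
    return weighted / total_impressions

print(average_position(rows))  # a blended figure, pulled towards high-volume queries
```

The useful property is that high-volume queries dominate the blended figure, which is also why a reporting change that removes a chunk of deep-SERP impressions can move the number without any actual ranking change.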

But, if I am honest, it has been proving a little frustrating recently and is starting to raise some uncomfortable questions about the real value of the data it provides.

When the numbers stop adding up

The first cracks started to appear around September 2025. Many of us logged in to find what looked like catastrophic drops in impressions. In some cases, sites appeared to lose 20–50% of their visibility overnight. 

Naturally, panic followed. Except… nothing else changed.

Traffic in Google Analytics held steady. Rankings appeared stable (or even improved). Conversions carried on as normal. The only thing that had changed was the reporting in Search Console.

The explanation, which I wrote about in a previous blog post, was that Google removed support for a parameter that allowed tools (and bots) to scrape deeper search results. As a result, a large chunk of what had previously been counted as “impressions” simply disappeared from the data. 

On paper, that sounds like an improvement in accuracy.

In reality, it created a fairly fundamental problem: you can no longer compare historical data with any real confidence. Pre-September and post-September data are effectively measuring different things.
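One practical consequence is that any like-for-like comparison now has to treat the series as two separate segments, split at the point the reporting changed. A minimal sketch of that idea, using a hypothetical cutoff date and made-up daily figures:

```python
from datetime import date

# Hypothetical cutoff standing in for whenever the reporting change
# landed for a given site; the figures below are invented.
CUTOFF = date(2025, 9, 10)

daily = {
    date(2025, 9, 8): 12000,
    date(2025, 9, 9): 11800,
    date(2025, 9, 10): 6100,
    date(2025, 9, 11): 5900,
}

def split_means(daily, cutoff):
    """Mean daily impressions before and after the reporting change."""
    before = [v for d, v in daily.items() if d < cutoff]
    after = [v for d, v in daily.items() if d >= cutoff]
    return sum(before) / len(before), sum(after) / len(after)

before_mean, after_mean = split_means(daily, CUTOFF)
```

Comparing within each segment is fine; comparing across the cutoff is comparing two different measurement regimes.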

For a tool that is so often used for trend analysis, that is… not ideal.

The “stuck data” problem

Another infuriating issue that seems to have developed over the past year is that Search Console has simply stopped updating its data. Entire reports have frozen for days at a time, rendering the tool effectively useless for recent performance analysis.

Google has acknowledged some of these issues, including delays to indexing and performance reports that lasted weeks before being resolved. That might sound minor, but it really is not.

If you are trying to assess the impact of a site change, a content update, or even a suspected algorithm shift, a lag of several days (or longer) makes the tool borderline redundant for decision-making.

You are left flying blind.

Missing data and indexing confusion

More recently, there have been growing reports of missing or inconsistent indexing data within Search Console.

Pages that are indexed appear as not indexed. Reports lag behind reality. URL inspection tools give conflicting signals. Even Google’s own representatives have had to step in to clarify what is “normal” versus what is actually broken.

At the same time, broader Google bugs and volatility continue to muddy the waters, with confirmed serving issues and gaps in reporting adding to the sense that things are not entirely under control. 

None of this inspires confidence.

Average ranking position

I mentioned above that Search Console’s approach to reporting SERP visibility, which is based on an average ranking rather than an absolute one, has always been my preferred way to (begrudgingly) analyse rankings.

I say begrudgingly as I have never been a big fan of ranking as a metric for success for SEO initiatives. A helpful barometer, no doubt, but certainly not the accurate thermometer that some believe it to be.

This is another area where I have, unfortunately, started to lose confidence. Whilst there is no doubt that the SERPs can be very volatile, some of the reports that I have seen in recent months just don’t feel possible.

As an example, here is the reported average rank for a fairly competitive fintech-related term that is important for a client:

[Chart: wild SERP volatility]

As you can see, it has been quite chaotic in recent weeks and months. The relative stability of the actual traffic to the site does not support this level of movement, so it becomes very hard to trust the data and, more importantly, forces me to question the value of using it to share performance indicators with clients.

A tool built for clarity… now creating ambiguity

The irony is that Search Console was supposed to simplify things. Instead, it increasingly feels like another layer of interpretation.

Is that drop in impressions real, or just a reporting change? Is that indexing issue genuine, or just delayed data? Is performance flat, or is the data simply not up to date?

Even Google’s own messaging often leans towards reassurance: don’t panic, it’s just a reporting glitch. 

But the simple truth is that reassurance starts to wear thin when the data is subject to seemingly constant glitches, or when long-term analysis becomes impossible because the approach to data collection has changed.

Is this the demise of Search Console?

“Demise” might be too strong a word.

Search Console is still incredibly useful. It still provides insights you cannot easily get elsewhere. Crucially, it is still one of the few direct lines into how Google sees your site.

But I cannot deny emitting a few long sighs recently, and my trust in it has certainly been rocked. It feels unloved and I worry about the direction of travel. It is a subtle, but important, shift.

What is the conclusion?

To be honest, the recent hiccups simply reinforce a point that we have known for a while, but perhaps ignored because Search Console was so convenient.

No single data source should be trusted in isolation.

Search Console data needs to be triangulated with analytics platforms, third-party tools and actual business outcomes. 
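As a sketch of what that triangulation can look like in practice, the simplest version is a daily cross-check of Search Console clicks against analytics sessions, flagging days where the two diverge sharply (all names, figures and the 30% threshold here are invented for illustration):

```python
# Made-up daily figures: GSC clicks vs analytics sessions for the same site.
gsc_clicks = {"2025-09-09": 900, "2025-09-10": 880, "2025-09-11": 400}
ga_sessions = {"2025-09-09": 950, "2025-09-10": 930, "2025-09-11": 920}

def divergent_days(gsc, ga, threshold=0.3):
    """Return dates where the clicks/sessions ratio strays beyond the threshold."""
    flagged = []
    for day in sorted(set(gsc) & set(ga)):
        if ga[day] == 0:
            continue
        ratio = gsc[day] / ga[day]
        if abs(1 - ratio) > threshold:
            flagged.append(day)
    return flagged
```

A day that is flagged is not automatically a reporting glitch, but it is exactly the kind of discrepancy, like September’s impression drop with flat traffic, that warrants a second data source before anyone panics.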

It can contribute to the story, but it should not be the entire story (however much I may have praised it in the past). The past year has proven that Google’s own data is not immune to confusion.
