The SEO community has been up in arms this week after Google appeared to disable a certain URL parameter that was widely used by various bots and tools.
Background
A ‘URL parameter’ is probably up there with ‘canonicalisation’ as one of the more obscure phrases that you are only likely to run into if you are a bit of an SEO nerd.
For the more normal readers, a URL parameter is essentially a piece of data added to a URL, normally after a question mark, that gives the web server extra information so it can serve dynamic content. Parameters are often used for filtering or sorting the content that is displayed. For example, an online store may add a parameter such as ‘category=silver-rings’ when the page is showing silver rings.
Until recently, you could add ‘&num=100’ to a Google search results URL to display up to 100 organic search results on one page instead of the default ten(ish). I would be very surprised if many actual human users did this, as you would simply scroll through the paginated results, but most software tools and bots used this hack to retrieve far more organic results in a single request.
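To make the mechanics a little more concrete, here is a minimal sketch (in Python, purely for illustration) of how a tool might have bolted the parameter onto a search URL. The function name and values are my own, and the parameter no longer behaves this way.

```python
from urllib.parse import urlencode

# Purely illustrative: a sketch of how a rank-checking tool might have built
# such a request. Google no longer honours the num parameter, so treat this
# as a historical example rather than a working recipe.
def build_search_url(query: str, num_results: int = 100) -> str:
    params = {
        "q": query,          # the search term
        "num": num_results,  # the (now disabled) 'show N results' hack
    }
    return "https://www.google.com/search?" + urlencode(params)

print(build_search_url("silver rings"))
# https://www.google.com/search?q=silver+rings&num=100
```

A human would never bother typing that; it was the tools pulling extended result sets that leaned on it.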
What is the impact of using &num=100?
Perhaps the most notable by-product of software requesting the extended results is that sites ranked in low positions would see artificially inflated impression figures. Whilst a human is unlikely to scroll past two or three results pages, these bot-driven ‘searches’ would give the impression that lowly ranked sites are attracting more interest than they actually are.
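To illustrate the mechanism with some entirely made-up numbers: imagine a page ranking at position 45, a small share of humans who ever paginate that deep, and a pile of tool requests that each pull 100 results. A back-of-an-envelope sketch, with every figure invented purely for illustration, might look like this:

```python
# Hypothetical figures only - invented to illustrate how &num=100 requests
# could inflate impressions for a page ranked well below the first page.

RANK = 45                # where the page actually ranks
HUMAN_SEARCHES = 1_000   # real users, most of whom never paginate that deep
BOT_SEARCHES = 400       # tool/bot requests using the &num=100 hack

# Assume (for the sake of the example) that 2% of humans ever reach page 5,
# whereas every 100-result bot request registers an impression at position 45.
human_impressions = int(HUMAN_SEARCHES * 0.02)   # 20
bot_impressions = BOT_SEARCHES                   # 400

total = human_impressions + bot_impressions
print(f"Impressions reported: {total}")
print(f"Genuinely human: {human_impressions} ({human_impressions / total:.0%})")
```

On those made-up numbers, roughly 95% of the reported impressions would never have been seen by a human, which is exactly the sort of distortion that makes the figures hard to trust.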
I admit that I have sometimes doubted the accuracy of Google Search Console data that suggests a very high number of impressions for a page ranking in very low average positions.
This is definitely a trend that seems to have accelerated over the past year or so.
Another key factor of 2025 has been the appearance of AI overview results. These have had a very negative impact on click-through rates (CTR) for traditional organic listings: when the user’s question is answered directly on the search results page, there is no longer any need to click through to other sites to find the information they are looking for.
It should be noted that CTRs have not dropped for all searches, but I have seen some fairly staggering drops for certain search terms, most often top-of-the-funnel / generic informational searches.
Any drop in organic search traffic often triggers panic for the SEO bods amongst us, but I have been comforting myself with positive average position data in Google Search Console and, crucially, improved year-on-year visibility indicated by increases in impression data. I had simply assumed that the drop in CTR was entirely down to the arrival of the AI overviews.
The disabling of ‘&num=100’ would suggest that, perhaps, the increased impression numbers should not be trusted.
What has been the impact of disabling &num=100?
The most immediate impact of Google clamping down on the parameter hack has been the apparent apocalypse in rankings.
Where a site no longer appears on ‘page 1’, which very rarely actually shows ten results, most rank checking platforms will now suggest that the previously ranked page has dropped off the radar entirely.
Cue a significant panic.
When I panic, I normally turn to Google Search Console. This is especially true for anything ‘ranking’ related, as I do not like talking about rank when there is such a strong level of personalisation within organic search results. The ‘average position’ metric in Google Search Console is, in my humble opinion, a much better measure and *should* iron out odd personalised results, which could skew whichever rank checking platform you are using.
In every specific example I looked at in more detail, Google Search Console was not reflecting the catastrophic ranking drop that I was seeing in rank reports.
Once the disabling of the &num=100 parameter became evident, this all made sense and, whilst it presents some short-term challenges, it is reassuring to know that things were not as worrying as they appeared when staring at a sea of red numbers on rank charts.
A more interesting trend, however, has been the drop in page impression data in Google Search Console. Although the drop is not universal, a lot of the accounts I have been looking at show a sudden fall in page impressions.
The only logical explanation for this is that a big chunk of previously reported impressions was down to bots, not actual humans. Whilst this should not have been a great surprise, I have been somewhat startled by the extent to which this appears to have been the case.
The good
I am often critical of the wider SEO community for getting their knickers in a twist about *any* change.
If you step back, I think most would agree that having data that far more accurately reflects human behaviour is a good thing.
A lot of digital marketing metrics fall into the ‘vanity metric’ camp, and I would humbly suggest that feeling smug about huge increases in impressions is a bit short-sighted if you are not actually attracting more human brand visibility.
In that regard, I welcome this change.
The bad
Every silver lining has a cloud…
Whilst I welcome the advent of (hopefully) more accurate user data, it is now nigh on impossible to compare year-on-year data in Google Search Console.
I used to bang on about how accountable online marketing is, but the reality is that it is becoming harder and harder to measure everything with as much accuracy as we would like.
Privacy concerns and legislation such as GDPR have made measurement a huge challenge in recent years, and it feels like another kick in the teeth when the goalposts shift in this manner. It is incredibly frustrating not to be able to measure the impact of marketing initiatives, nor to have a consistent data set that allows you to prove the value of your work.
Conclusion
My personal take on this debacle is that it is great that Google is taking steps to remove artificial data from web analytics, despite the headaches that result when looking at year-on-year data. It would be nice if this could be retro-fitted to previous years, but that is not going to happen.
It has been an eye-opener regarding the scale of bot traffic that has clearly been inflating impressions in Google Search Console. It also suggests that the impact of AI overviews on CTR may not be quite as severe as it first appeared, given that some of those ‘lost’ clicks were being measured against impressions that were never human in the first place.
As the dust settles on this particular furore (the rank checking platforms are already finding workarounds), I think it should be a good reminder to focus on the metrics that REALLY matter and those that you can have more confidence in.
For example, should you actually care about impression data in Google Search Console, or a ranking report, if you can see that you are receiving 50% more high-quality enquiries through your website than you did last year?
It is also a prompt to think carefully about what you want your website visitors to do once they are on your site, and to configure conversions / key events in whatever analytics platform you use to measure those actions. Most importantly, make sure that you compare your web analytics stats to actual data: for example, how many enquiry forms did you receive compared to the number GA4 is suggesting?
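As a rough illustration of that sanity check, here is a small sketch with invented figures and an arbitrarily chosen 10% tolerance, comparing what GA4 reports against the enquiries actually counted in the inbox or CRM:

```python
# Hypothetical sanity check: compare the key events GA4 reports against what
# the business actually received. All figures and the 10% tolerance are
# invented for illustration.

ga4_reported_enquiries = 182   # e.g. 'generate_lead' key events for the month
actual_enquiry_forms = 205     # enquiry forms counted in the inbox / CRM

discrepancy = (actual_enquiry_forms - ga4_reported_enquiries) / actual_enquiry_forms
print(f"GA4 appears to be under-reporting enquiries by {discrepancy:.0%}")

# If the gap is consistently large, check the tracking set-up (consent mode,
# ad blockers, mis-firing tags) before trusting the trend lines.
if abs(discrepancy) > 0.10:
    print("Gap exceeds 10% - review event configuration before relying on this data.")
```

The exact threshold does not matter; the point is to reconcile the analytics numbers against something the business actually counted.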
I have used the analogy before, but conversion data tells you what is actually happening in the same way that a thermometer will give you a temperature reading. Impression / ranking / etc. data is all more akin to a barometer, which will give you a trend and hint at likely performance.
Both have their merits, but I would humbly suggest that you focus on the actual measured performance of your website rather than what a third-party platform is telling you.