In the past week, I’ve noticed Google’s AI Overview creep into my search results pages on both mobile and desktop.
As I’ve highlighted previously, I’m not a huge fan of the feature, and it’s served up some absolutely wild answers when it’s pulled data from shitposters on Reddit, which, admittedly, has been pretty funny.
So now that Google AI Overviews (AIOs) have arrived in the UK, have the results improved, and what impact will they have on SEO?
Here’s what the AI Overviews currently look like:
On desktop, when clicking ‘show more’, the answer box will expand and, to the right of it, show the sources cited as references. You can also click on the link icon next to each section of the answer to see where the information being cited has come from.
I decided to take a look at which queries are showing AI Overviews, and which sources they cite. Were they relevant? Were they factually correct? And was Google favouring sites that already ranked well?
The test
I don’t have any fancy software that can pull data from SERPs that includes whether or not an AI Overview is shown, so I’ve had to do this research by manually searching random queries, seeing if they offer up an AI Overview, and then checking each source. For that reason, I’ve only been able to look at 50 different search queries.
As part of the test, I also took note of when AI Overviews didn’t appear at all, or appeared less often, as well as which types of prompts show an AI Overview and which don’t. So I’ve actually performed searches well into the hundreds while conducting this research, but I didn’t have the time or resources to note down every single one of them.
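If you fancied repeating this yourself, a spreadsheet is all you really need, but for anyone who prefers a script, here’s a minimal sketch in Python of the kind of log I’m describing. The file name, field names and example rows are purely illustrative assumptions, not my actual data.

```python
# Minimal sketch of a manual-search log: one row per cited source.
# File name, fields and rows are illustrative placeholders only.
import csv

FIELDS = ["query", "category", "aio_shown", "cited_source", "organic_position"]

# organic_position is left blank when the cited source doesn't appear
# in the top ten organic results.
observations = [
    {"query": "example query one", "category": "home/garden",
     "aio_shown": "no", "cited_source": "", "organic_position": ""},
    {"query": "example query two", "category": "health/fitness",
     "aio_shown": "yes", "cited_source": "example.com", "organic_position": 3},
]

with open("aio_observations.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(observations)
```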
There were some interesting patterns across different verticals and queries.
- Recipes/food – rarely show AI Overviews. No AI Overviews for ‘top ten vegetarian recipes’, ‘lamb mince recipes’, ‘how to make pizza dough’, which surprised me, though it did tell you how to boil an egg
- Where to visit/things to do/top attractions in X – limited AI Overviews. This was also kind of surprising as I was expecting an AI Overview to collate and summarise data for these types of queries
- General knowledge about people – not many AI Overviews are shown for specific people. It will provide info about the tallest man in the world but not ‘how tall is Taylor Swift’, ‘how many children does Elon Musk have’, ‘who is Ben Affleck married to’, or ‘how old Michael Jackson was when he died’. It would show for queries like ‘highest earning pop stars’ and ‘first man on the moon’
- Home/garden/craft – lots of AI Overviews. LOTS. If you’ve written content on how to do something that’s been ranking well, you might start seeing a big drop in traffic
- Health/fitness – lots of AI Overviews, which is slightly alarming to me, given that Google’s disclaimer (‘This is for informational purposes only. For medical advice or diagnosis, consult a professional. Generative AI is experimental.’) could easily be overlooked. On the plus side, it did not show an AI Overview when searching for things that could be deemed harmful, such as ‘quickest way to lose 20kg without exercise’ or ‘how to remove toenails’
- Finance – this was limited. It did show for queries including ‘how to open a savings account’ and ‘ISA vs bonds’ but nothing for ‘best bank accounts’ or ‘best mortgage rates UK’. I’m glad it didn’t, as these figures change all the time and would therefore likely be inaccurate.
- Motoring – a bit of a mixed bag. It did not show for some repair-related searches but did for others (no to ‘how to change a tyre’, yes to ‘fix chipped windscreen’ and ‘repair scratched alloys’), and it showed for ‘top 10 most reliable cars’ but not ‘best electric cars’
- Retail – fewer AI Overviews, and ads were prominent for the vast majority of searches. None of the following queries served an AI Overview, even when less commercial (best washing machine under £300, best portable turntables, affordable wedding dresses, which Dr. Martens are most comfortable, best bikinis for 2024)
From these searches, I’ve found that AI Overviews show less often for queries including words like ‘top’ or ‘best’.
Next, I looked at search queries where an AI Overview was shown. I wanted to find out three things:
- Was the data relevant?
- Was the data factually correct?
- Did the sources cited already rank on page one?
Here are the results.
The results
Data relevancy
For the most part, the sources cited were relevant, with a few caveats.
For some queries, the AIO was citing transactional sources rather than informational ones; for example, links to product pages when the query was clearly phrased as a question seeking an answer.
In another instance, I saw the brand ‘Vets Now’ cited in the answer.
For the same query (choking in dogs), one of the sources cited was about vomit colour – not choking. Related, sure, but there was no mention of vomit anywhere in the AIO, and the source did not appear on page one, so I’m not sure why it was included as a citation.
I also had some concerns about highly trusted and authoritative sources not being cited, especially for health-related queries.
For the query ‘how to reduce my calorie intake’, there were no citations for NHS, CDC or WebMD, despite all of these results appearing on page one. Instead, the AIO opted to cite fitness apparel and supplement sites.
I also found there was perhaps some Google bias in the AIOs, with YouTube videos featuring heavily for some queries, such as ‘how should redheads do their eyebrows’. The video content was not particularly helpful, and some of the videos were just product reviews with no real tips or advice.
For the search ‘adding multiple services to Google Business Profile’, the top four results were from Google, and only the first was relevant. There were much better sources on page one that were not cited.
Data accuracy
The accuracy of the data is much improved from what we saw a few months ago.
The only clanger I spotted was for the query ‘how to clean vinyl records’, where the AIO suggested using sandpaper for a smooth finish. The source cited was a YouTube comment: ‘I like to clean mine with sandpaper and polish, gives a nice smooth finish.’
Some information was also outdated. The music artist with the most monthly listeners on Spotify as of August 2024 was Billie Eilish, not The Weeknd as the AIO stated.
There was a slightly alarming citation for ‘how to kill wasps’.
While I’m not doubting it would work, it appears at first glance that using gasoline might be a way of murdering our stripy, angry pals. However, when you visit the cited website, the post recommends that people absolutely do not try killing wasps by blowing them up with petrol. Nowhere in the top organic results is killing wasps with gasoline mentioned, and reputable sites such as Gardener’s World and Good Housekeeping are not cited by AIO. Thankfully, neither is Reddit, as some of the organic results shown sound like an accident waiting to happen.
I spend a lot of time searching for ‘who would win xxx’ queries, mostly because I like winning arguments in the pub. I expected the query ‘which would win lion or tiger’ to churn out some fairly random results, but a talk about leadership being cited as a source was something I wasn’t expecting.
Cited sources’ SERP positions
In most cases, the sources cited were from sites that ranked well in the SERPs, though there were a few exceptions. For the query ‘which denim styles are in fashion now’, the AIO did not cite Vogue, despite it appearing twice on page one, or Glamour, which was in position one.
The citations for ‘what can I drink to detox my body overnight’ were also a bit odd. The sources shown were primarily from India-based domains, including the Times of India (twice), an Indian supplement website, and another Indian domain showing slideshow content. In this instance, very few results from page one were cited.
The search ‘how to grow eyebrow hair’ cited even fewer page-one results: just one, from Healthline. The rest were beauty and holistic health sites, which suggested putting onion juice on your eyebrows. That’s a new one for me!
One search query in particular produced some surprising citations, with just a single site on page one being cited: ‘how to get position one on Google’.
Most of the big players in SEO were on page one organically – Ahrefs, Backlinko, WordStream, and even Google. But only one site appearing on page one, Hobo Web, was referenced in the AIO.
One of the citations was for a LinkedIn Pulse article from 2019, which has quite a few grammatical errors, typos and some inaccurate information – plus it’s about TV ads vs social media marketing.
Additionally, an AI company is cited in the AIO, and I wonder whether its content is 100% AI-generated.
Conclusion
After completing this research myself, I came across a bigger and much better study from seoClarity, which looked at 36,000 keywords – but with a focus on ‘money’ keywords. While my searches were long tail and, in some cases, arguably more obscure, their results largely match the conclusions I came to, even with a significantly smaller sample size.
Finding 1: Sites that rank well are more likely to be included in AI Overviews
While there are some outliers (likely due to the long-tail nature or complexity of some of the queries I tested), the vast majority of AI Overviews are dominated by sites that were already ranking prominently. The likelihood of being cited gradually reduces the further down the SERP a result sits.
The totals are as follows:
Position 1: 31
Position 2: 29
Position 3: 22
Position 4: 24
Position 5: 23
Position 6: 15
Position 7: 15
Position 8: 16
Position 9: 14
Position 10: 9
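For the record, I tallied these by hand, but if you kept a CSV log like the hypothetical one sketched earlier, a few lines of Python would do the counting for you:

```python
# Rough sketch: tally how often cited sources sit at each organic position,
# assuming the hypothetical aio_observations.csv log described earlier.
import csv
from collections import Counter

counts = Counter()
with open("aio_observations.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["aio_shown"] == "yes" and row["organic_position"]:
            counts[int(row["organic_position"])] += 1

for position in range(1, 11):
    print(f"Position {position}: {counts[position]}")
```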
Finding 2: The cited sources are, in general, serving factually correct information
When the feature first exited beta in the US, it was plagued with issues. Lots of the results were being pulled from UGC sources such as Reddit and Quora, which gave us some absolute gems, including adding glue to pizza sauce to make the cheese stick better, and drinking urine to help pass kidney stones.
Finding 3: Smaller sites might get a look-in – but not necessarily any traffic
For the more obscure searches, smaller sites may have an advantage if they can answer the query better than more authoritative sites that rank well. But that all very much depends on whether anyone then bothers to click the link. Which, let’s be honest, they probably won’t.
Finding 4: AI Overviews might still result in less traffic
If an AI Overview answers a query sufficiently, cited or not, people are less likely to click on the links. As this feature is relatively new, we’ll need to monitor the impact AIOs have on traffic and conversions, particularly for sites that ranked in the top positions organically.
Finding 5: Don’t stop what you’re already doing
There’s been talk of ‘optimising for AI’. Don’t bother. The results show that for the vast majority of queries, AI Overviews cite sources that appear on page one. Not all of them, but as seoClarity highlighted in their research, this is happening 99.5% of the time!
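If you wanted to sanity-check that overlap figure against your own data, a quick sketch along the same lines (again assuming the hypothetical CSV log from earlier) would be:

```python
# Rough sketch: what share of AIO citations also rank in the top ten
# organic results? Assumes the hypothetical aio_observations.csv log.
import csv

cited = 0
on_page_one = 0
with open("aio_observations.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["aio_shown"] == "yes" and row["cited_source"]:
            cited += 1
            if row["organic_position"]:  # blank means not in the top ten
                on_page_one += 1

if cited:
    print(f"{on_page_one / cited:.1%} of AIO citations also rank on page one")
```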
Here’s my research, if you want to check it out yourself.