I have found myself praising Google Search Console a lot over the past year or so.
It has always been an extremely helpful resource, and one that I would encourage any webmaster to add to their regular ports of call, but I wanted to doff my hat once again after discovering the new Search Console URL Inspection API. Whilst this will not actually change the ‘logged in’ experience of using Search Console, it is a welcome shot in the arm for any tool that uses Search Console data.
For the non-geeks, ‘API’ stands for Application Programming Interface. You can think of an API as a window to third-party data, or a bridge that allows applications to communicate with each other. The new URL Inspection API allows software applications to pull Search Console data relating to specific URLs. In many cases, this will greatly enhance the value of the data shown within those applications.
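For the more technically minded, here is a minimal sketch of what a call to the API looks like. It assumes you have already obtained an OAuth 2.0 access token for the Search Console scope; the property and page URLs are illustrative placeholders.

```python
# Minimal sketch of building a URL Inspection API request, assuming you
# already hold an OAuth 2.0 access token. The example site/page URLs are
# placeholders; swap in your own verified Search Console property.
import json
import urllib.request

ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

def build_inspection_request(site_url: str, page_url: str, access_token: str):
    """Build the HTTP POST request for inspecting one URL in one property."""
    body = json.dumps({
        "siteUrl": site_url,        # the GSC property, e.g. "sc-domain:example.com"
        "inspectionUrl": page_url,  # the page to inspect; must belong to the property
    }).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_inspection_request(
    "sc-domain:example.com", "https://example.com/page", "YOUR_TOKEN"
)
```

Sending that request (via `urllib.request.urlopen(req)`) returns a JSON `inspectionResult` describing index status, mobile usability, AMP and rich result state for the page.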
One such software application is the Screaming Frog crawler, which is a firm favourite of mine. The latest version (16.6 – significant enough to get its own romantic ‘Romeo’ name) adds an extremely useful drop-down to the Search Console tab:
What does this drop-down allow you to do?
Each of these options allows you to drill down into specific issues that may be affecting your performance in organic search.
- Clicks Above 0 : An easy one to start with – this filter will show all URLs that have had at least one click in Google’s SERPs.
- No GSC Data : This filter will show all URLs found by the Screaming Frog crawler that do not return any data from Search Console. The most likely reason for this is that the URLs did not have a single impression on Google’s SERPs, so it is worthwhile seeing if there are some important pages that need some extra SEO love.
- Non-Indexable with GSC Data : This filter will show URLs that the crawler flags as non-indexable (for example, because of a ‘noindex’ tag, a canonical pointing elsewhere, or a non-200 status code) but which still return data from Search Console. This is often a sign that Google has indexed, or is serving impressions for, a page you did not intend it to.
- URL Is Not on Google : This will show any URL that the crawler found but which is not indexed by Google. This is important as you won’t appear in the search results if the page is not indexed. This can, of course, be intentional and this list *should* include all URLs that have a ‘noindex’ tag, but it is a convenient report to identify URLs that are not appearing in Google’s search results.
- Indexable URL Not Indexed : This is similar to above, but it will exclude all those URLs that should not be indexed, so is arguably a more useful report for identifying problem pages. In essence, it will show all URLs that the Screaming Frog crawler finds and should be indexed by Google, but which are currently not.
- URL is on Google, But Has Issues : This will highlight URLs which are indexed and can be returned in Google’s search results, but which have warnings attached. These warnings include issues with mobile usability, AMP or rich results that are likely to impact performance.
- User-Declared Canonical Not Selected : This will show instances where Google is ignoring the canonical tag that you have declared and has chosen to index a different URL. Remember that canonical tags are hints rather than absolute directives and Google can get this wrong, so it is helpful to see where your instructions are being ignored.
- Page Is Not Mobile Friendly : This is fairly self-explanatory – it will show all those URLs which are being flagged as having issues on mobile devices.
- AMP URL Is Invalid : If you are using Accelerated Mobile Pages (AMP), you will know that Google can be a bit precious with validation. This report will quickly highlight where problems are occurring and all the URLs which will not be featured in search results.
- Rich Result Invalid : This will show all URLs that have errors with rich result enhancements. These errors will most likely prevent rich results from showing in the Google search results, so should be investigated.
The first three filters in this list are not new, but the others have all been added thanks to the URL inspection API.
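Under the hood, filters like these map onto fields in the API’s `inspectionResult` payload. The sketch below shows roughly how a tool might derive a few of the labels above from one response; the field names follow the API’s index status, mobile usability and rich results sections, but treat the exact strings as illustrative rather than exhaustive.

```python
# Rough sketch of mapping one URL Inspection API response onto the filter
# labels discussed above. Field names follow the API's inspectionResult
# payload (indexStatusResult etc.); the logic is simplified for illustration.

def classify(result: dict) -> list[str]:
    """Return the filter labels that apply to one URL's inspection result."""
    labels = []

    index_status = result.get("indexStatusResult", {})
    if index_status.get("verdict") != "PASS":
        labels.append("URL Is Not on Google")

    # Canonical mismatch: Google chose a different canonical to the one declared.
    user = index_status.get("userCanonical")
    google = index_status.get("googleCanonical")
    if user and google and user != google:
        labels.append("User-Declared Canonical Not Selected")

    if result.get("mobileUsabilityResult", {}).get("verdict") == "FAIL":
        labels.append("Page Is Not Mobile Friendly")

    if result.get("richResultsResult", {}).get("verdict") == "FAIL":
        labels.append("Rich Result Invalid")

    return labels

sample = {
    "indexStatusResult": {
        "verdict": "PASS",
        "userCanonical": "https://example.com/a",
        "googleCanonical": "https://example.com/b",
    },
    "mobileUsabilityResult": {"verdict": "FAIL"},
}
print(classify(sample))
# → ['User-Declared Canonical Not Selected', 'Page Is Not Mobile Friendly']
```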
I typically like to stay in the Screaming Frog application and export specific issues, but there is a handy report that you can use to summarise all the data relating to individual URLs, which is available within the ‘Bulk Export’ menu:
This will export a spreadsheet including all URLs and display any errors / warnings that should be investigated.
What is the catch?
There isn’t really any catch – it is genuinely helpful to have this data available to be pulled into other applications – but there is a data quota.
You are limited to 2,000 total queries per day and there is a throttle to limit the number of API queries to 600 per minute. The speed limit shouldn’t cause any problems, but the 2k daily limit is an obvious hiccup for larger sites and you will need to spread the queries over multiple days if you have a site with over two thousand URLs in Search Console. An irritation, but not the end of the world and I am most definitely pleased to see this data available in Screaming Frog.
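The arithmetic for larger sites is simple enough to sketch. The daily limit comes from Google’s published quota; the `day_batches` helper below is just a hypothetical name for illustration.

```python
# Simple sketch of spreading inspections across days under the documented
# quota of 2,000 queries per day per property. day_batches is a hypothetical
# helper, not part of any official client library.
DAILY_QUOTA = 2000

def day_batches(urls: list[str], quota: int = DAILY_QUOTA) -> list[list[str]]:
    """Split a URL list into per-day batches that each fit within the quota."""
    return [urls[i:i + quota] for i in range(0, len(urls), quota)]

urls = [f"https://example.com/page-{n}" for n in range(4500)]
batches = day_batches(urls)
print(len(batches))      # → 3 (days needed for 4,500 URLs)
print(len(batches[-1]))  # → 500 (URLs left for the final day)
```

A tool would then work through one batch per day, staying comfortably under the 600-queries-per-minute throttle within each batch.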
Thank you Google for another great enhancement to the mighty Search Console. If you want to keep an eye on new features, it is worth reading https://developers.google.com/search/blog.