I was extremely pleased to read the announcement yesterday of the deeper integration of data between Google Search Console and Google Analytics.
Matt has written about the good, the bad and the ugly of Google’s Search Console data. Anyone with a website really, really should be checking in with Search Console on a regular basis, although it is certainly not perfect. Not only is some of the data a little suspect, but there has always been a disconnect between the visibility / average rank reports and what that traffic actually does once on the site, as reported in Google Analytics.
The announcement yesterday stated:
“Today, we are introducing the ability to display Search Console metrics alongside Google Analytics metrics, in the same reports, side by side – giving you a full view of how your site shows up and performs in organic search results.”
Crucially, it goes on to say:
“But to gain a fuller picture of your website’s performance in organic search, it’s beneficial to see how visitors reached your site and what they did once they got there.”
I have not yet seen the new report, which is being rolled out over the next few weeks, but a preview is included in the Webmasters Central blog post.
This is huge news and will be welcomed with the widest of open arms by digital marketers IF it delivers on the promise of marrying up keyword acquisition data with on-site behaviour. This data has been missing since searches were encrypted.
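Purely to illustrate what that marrying-up looks like in practice, here is a minimal sketch of the manual join you can attempt today with the two existing APIs. It assumes OAuth credentials are already in hand, and SITE_URL / GA_VIEW_ID are placeholders for your own properties. Note the limitation it runs into: because GA no longer exposes the organic keyword, the best a manual join can manage is matching at the landing-page level, and keyword-level joining is exactly what the new reports promise.

```python
# Sketch only: joining Search Console acquisition data with GA behaviour data
# at the landing-page level, using the existing public APIs.
from googleapiclient.discovery import build

SITE_URL = 'https://www.example.com/'   # Search Console property (placeholder)
GA_VIEW_ID = 'ga:12345678'              # Google Analytics view ID (placeholder)


def search_console_pages(creds, start, end):
    """Clicks, impressions and average position per landing page."""
    service = build('webmasters', 'v3', credentials=creds)
    response = service.searchanalytics().query(
        siteUrl=SITE_URL,
        body={
            'startDate': start,
            'endDate': end,
            'dimensions': ['page'],
            'rowLimit': 1000,
        },
    ).execute()
    # Map full page URL -> acquisition metrics
    return {row['keys'][0]: {'clicks': row['clicks'],
                             'impressions': row['impressions'],
                             'position': row['position']}
            for row in response.get('rows', [])}


def analytics_landing_pages(creds, start, end):
    """Organic sessions and bounce rate per landing page (Core Reporting v3)."""
    service = build('analytics', 'v3', credentials=creds)
    result = service.data().ga().get(
        ids=GA_VIEW_ID,
        start_date=start,
        end_date=end,
        metrics='ga:sessions,ga:bounceRate',
        dimensions='ga:landingPagePath',
        filters='ga:medium==organic',
        max_results=1000,
    ).execute()
    # Map landing page path -> behaviour metrics
    return {row[0]: {'sessions': int(row[1]), 'bounce_rate': float(row[2])}
            for row in result.get('rows', [])}


def joined_report(creds, start='2016-04-01', end='2016-04-30'):
    """Put acquisition and behaviour side by side, keyed on landing page."""
    sc = search_console_pages(creds, start, end)
    ga = analytics_landing_pages(creds, start, end)
    report = []
    for url, acquisition in sc.items():
        # Search Console returns full URLs; GA returns paths, so normalise.
        path = url.replace(SITE_URL.rstrip('/'), '') or '/'
        behaviour = ga.get(path)
        if behaviour:
            report.append({'page': path, **acquisition, **behaviour})
    return report
```

If the new reports deliver, this sort of plumbing becomes unnecessary, and the side-by-side view arrives at the keyword level rather than the page level.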
The big question is WHY has this been released?
In essence, we are simply being given back the data that we always used to have (and love). Remember that Google took keyword data away to provide “extra protection” for searchers. I will resist the urge to be cynical about this (the data was still provided for paid search ads), but I have to ask: why does Google now feel that this protection is no longer required?
Is it an olive branch to the digital marketing community and an acceptance that the data can help improve the experience for all users or is there something else afoot?
The march of the penguin?
I am hoping that the new reports are actually a sign that Google Penguin 4.0 is about to be unleashed on the digital world.
I should point out that this is conjecture on my part and I have absolutely no evidence to support the theory. It could just be Friday fantasy.
What I do think is abundantly clear, however, is that the real-time Penguin algorithm update has been proving a huge headache for Google. It was slated for launch in 2015, so something has been holding it up.
Personally, I can’t wait for Penguin 4.0 as I believe it will end a period of stagnation in the SERPs, where spammy sites are still doing well in far too many instances. I can only conclude that the delay reflects how difficult the update is to get right and the decent period of testing it naturally needs.
I also expect that it is going to be a big change. I think that this could be one of the more volatile updates in recent years, and my theory is that Google is being forced into giving more search performance data to calm the storm that will inevitably rage if the update is as volatile as I suspect.
With more transparency about visibility / rankings and the associated user behaviour once on a website, it becomes much easier to diagnose issues. Google could, therefore, be making it easier to understand the impact of Penguin 4.0, thereby reducing the wrath of the inevitable losers when the algorithm is updated.
What do you think? Fantasy or (possible) fact? Are the new reports the contractions before the birth of a penguin, or do they have absolutely nothing to do with it, and should we just thank Google for joining the dots?