Whether you like it or not, Google deems a site's link profile to be one of the most important factors when assessing its quality and relevance in SERPs.
But that might be about to change.
Apparently, Google might be about to 'fundamentally restructure how its search engine goes about indexing web pages'. A team of researchers wants to replace link profiling with an algorithm that ranks websites primarily on the relevance and factual accuracy of the information they contain.
Since 2013, Google's algorithm updates have been placing more and more emphasis on context, semantics and relevance, shifting from 'external' signals like link profiles to 'internal' signals like the content on the pages themselves. This, however, would mean moving away from link profiling entirely.
The reasoning behind the change is that the current system for returning search results is less than perfect. Using the link profile as a primary indicator of relevance means that highly trafficked websites, even those containing misinformation, are sometimes rewarded above sites that are more factually correct or relevant.
To combat this, the Google researchers are using advances in machine learning to develop a new system that
‘… would measure the objective factual accuracy of a webpage rather than its subjective reputation across the web…’
How does it do this? Chris continues:
‘instead of tracking the number of incoming links to a page, the knowledge-based system would count the number of incorrect facts throughout the page and use the results to assign a score, which the Google researchers are calling the Knowledge-Based Trust score.’
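For the technically curious, here's a crude, purely illustrative Python sketch of that 'count the incorrect facts and turn them into a score' idea. The function and data structures are mine, not Google's, and the actual research uses a far more sophisticated probabilistic model; facts are shown as (subject, predicate, object) triples, which is how the Knowledge Vault is reported to store its data.

```python
# Purely illustrative sketch of a 'Knowledge-Based Trust' style score --
# NOT Google's actual model, which is far more sophisticated.
# Facts are modelled as (subject, predicate, object) triples.

def knowledge_based_trust(page_facts, known_facts):
    """Naive trust score: the proportion of a page's extracted facts
    that agree with a reference set of facts believed to be true."""
    if not page_facts:
        return 0.0  # no checkable claims, so no evidence either way
    correct = sum(1 for fact in page_facts if fact in known_facts)
    return correct / len(page_facts)

# Hypothetical example: two of the three extracted facts check out.
known = {("Paris", "capital_of", "France"),
         ("Canberra", "capital_of", "Australia")}
page = [("Paris", "capital_of", "France"),
        ("Canberra", "capital_of", "Australia"),
        ("Sydney", "capital_of", "Australia")]   # the incorrect 'fact'

print(round(knowledge_based_trust(page, known), 2))  # 0.67
```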
In order to check facts, Google has been collecting data from all over the web, amassing a ginormous 1.6 billion facts in what has been titled the Knowledge Vault. This is the big brother of the Knowledge Graph, which was limited in where it could pull data from. The Knowledge Vault is not.
The concept behind the Knowledge Vault is two-fold:
1. If sites are to be assessed on the number of false facts they contain, those facts need to be checked against information from across the web (a toy sketch of what this might look like follows this list);
2. The researchers have observed that sources such as Wikipedia have essentially plateaued in the amount of correct information they are amassing, so a new system is needed to augment the knowledge base.
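To illustrate the first point, a toy version of checking a page's claims against a fact store might look like the snippet below. The Knowledge Vault is reported to hold facts as (subject, predicate, object) triples with confidence scores; everything else here, from the names to the threshold, is an assumption made purely for the example.

```python
# Toy illustration of checking a page's claims against a store of
# (subject, predicate, object) triples with confidence scores.
# The store contents, threshold and function name are assumptions
# made for this example, not a description of the real Knowledge Vault.

knowledge_vault = {
    ("Barack Obama", "born_in", "Hawaii"): 0.97,  # confidently true
    ("Barack Obama", "born_in", "Kenya"): 0.02,   # confidently false
}

TRUST_THRESHOLD = 0.9  # assumed cut-off for treating a stored fact as true

def count_false_facts(page_facts):
    """Count extracted facts that the store confidently contradicts:
    the triple is known, but its confidence is well below the threshold."""
    return sum(
        1 for fact in page_facts
        if fact in knowledge_vault and knowledge_vault[fact] < TRUST_THRESHOLD
    )

page = [("Barack Obama", "born_in", "Kenya"),
        ("Barack Obama", "born_in", "Hawaii")]
print(count_false_facts(page))  # 1 -- one confidently contradicted claim
```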
For some especially scintillating bedtime reading, the full concept and proposal can be read in the paper presented by the Google research team.
Quality, quality, quality
Despite what could be a dramatic change, all this should come as no surprise. Google's algorithm updates have been moving in this direction for the past couple of years; we've seen that the quality of links now plays a huge role in how Google determines the credibility and relevance of a site, as does the quality of the content on that site. This proposal simply shifts the balance further towards the latter.
Luckily, we have always taken a PR-led ‘content is king’ approach to digital marketing, so this news won’t have a groundbreaking effect on what we already achieve for clients, but for some SEOs it might mean rethinking their strategy.
In addition, for SEOs and users alike, there's a debate over whether we should be happy with Google taking further control of search. With link profiling there is a large human element: the sites and pages that fare well in search thanks to their link profile have been endorsed by us as a collective, through the sheer act of linking to and sharing that content. Moving to this new system, however, would mean accepting that a centralised artificial intelligence will interpret and ultimately decide which sites appear in SERPs, taking very few objective, human, crowd-sourced signals into account. Feels a little bit '1984' to me.
Regardless, if the research team gets their way, everyone will have to adapt.