In April 2012 Google introduced the “Penguin” algorithm to combat the link spam that has been polluting the web since people realised “more links = higher rankings”.
This is the kind of crap Penguin is designed to penalise:
Now, instead of “more links = higher rankings”, the equation has potentially changed to “bad links = lower rankings”. This is a good thing, right? The ultimate goal of Google’s search team and webspam team is to maximise the search engine user experience. When webmasters, content publishers and their overpaid SEO advisers build artificial links to a site, less relevant/useful content may rank above content that truly deserves to be at the top.
But a link penalty should be distinguished from link neutralisation. If Google simply neutralised spammy links, a page would rank exactly as if those links had never existed. The common understanding of Penguin, however, is that a penalised webpage actually drops in rankings further than it would have in the absence of the spammy links.
The problem with penalising bad links is that it has bred a cottage industry of consultants offering ‘negative SEO’ services: spamming competitor sites with poor-quality links in the hope of triggering a penalty.
In theory, Google’s link disavow tool provides a counter to this strategy for victimised webmasters. Of course, this is an imperfect solution as site owners range from the SEO-savvy to the SEO-clueless, who have no idea how to correctly disavow spam links. Google themselves even caution:
So you could be penalised for even attempting to do the right thing!
Further, even in a textbook scenario where an SEO-savvy webmaster detects a targeted link attack, identifies the bad links (not an easy feat, since Google gives you incomplete link data in Webmaster Tools and only partial clues via unnatural links warnings), and correctly disavows the links according to Google’s policies, it may be several months before an innocent site with relevant content recovers its former rankings.
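For readers unfamiliar with the disavow tool, the file Google accepts is a plain UTF-8 text file with one entry per line: a `domain:` prefix disavows every link from a domain, a full URL disavows a single page, and lines beginning with `#` are comments. The domains below are placeholders for illustration, not real spam sources:

```text
# Spammy links identified after an unnatural links warning
# (spam-directory.example and example.com are placeholder names)

# Disavow all links from an entire domain
domain:spam-directory.example

# Disavow one specific page only
http://example.com/paid-links-page.html
```

Getting this file wrong cuts both ways: disavow too broadly and you throw away legitimate links; too narrowly and the penalty remains.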
Is this fair? No.
Does ‘fairness’ matter to Google? No.
But content relevancy does, and link penalties have the very real (and I would say not uncommon) risk of penalising good content.
Even if webmasters themselves have engaged in artificial link spam on their own website, I would still argue this is not sufficient justification for Google to apply a penalty. Firstly, the site owner may have engaged an SEO company to build links on their behalf, without knowledge of the nuances of white hat link building. Secondly (and more importantly), the site may actually have highly relevant, quality unique content regardless of bad inbound links. Does a search user’s experience benefit in this scenario? Of course not. When you search Google, you see the on-site content, not the off-page SEO and link building that has gone into a website.
The original (and correct) role of inbound links as one of the most important search engine ranking signals is to act as a quantifiable indicator of quality content.
This signal should only work in a positive direction – a website should be able to benefit from good links, but not be penalised by bad links. Search users do not (and should not) care about inbound external links to a website, and thus penalising websites with relevant high quality content will only negatively affect the search user’s experience.