BBC Penalised on a Granular Level
“I am a representative of the BBC site and on Saturday we got a ‘notice of detected unnatural links’. Given the BBC site is so huge, with so many independently run sub sections, with literally thousands of agents and authors, can you give us a little clue as to where we might look for these ‘unnatural links’.”
John Mueller (Google) has now confirmed that ‘granular action’ was taken against a single article on the BBC’s website and that it did not negatively impact the whole site.
What does this all mean?
Google doesn’t favor big brands
One of the top ‘conspiracy theories’ in the SEO community is that Google has some special preference for large brands and often overrides its main algorithm to save them from being affected. The fact that the BBC received the notification means the site wasn’t immune to algorithmic search quality filters. What others are observing with big brands is probably correlational, not by design.
Negative SEO is not a myth
Negative SEO is in fact possible, and we can see this from various statements and changes in wording on Google’s website. They now say “we’re working hard to prevent this from happening” instead of “it’s not possible”. The BBC case shows that small sites are not the only ones at the mercy of deliberately malicious inbound links.
Why did this happen?
We don’t know exactly what happened, but considering that the BBC hardly needs more links, it’s easy to assume this is the work of a third party. For example, an SEO or reputation management campaign attempting to:
- Bury a result by penalising the page
- Boost a result by inflating the page’s link count
What can Google do about this?
If Google’s algorithm was able to detect the bad links, it should also have been able to define a norm for the page’s link profile and flag the deviation. Instead of taking granular action against the page, Google could simply have ignored the offending links.
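To illustrate the idea, here is a minimal sketch of "define a norm and flag the deviation". This is not Google's actual method; the spam scores, threshold, and function name are all hypothetical. The sketch scores each inbound link, then flags links whose score sits well above the page's own norm, so they could be ignored rather than used as grounds for a penalty.

```python
# A hypothetical sketch of outlier detection on a page's inbound links.
# Links whose spam score deviates far above the page's norm are flagged
# as ignorable instead of triggering action against the page itself.
from statistics import mean, stdev

def ignorable_links(link_scores, threshold=1.0):
    """Return indices of links whose spam score is more than
    `threshold` standard deviations above the mean score."""
    mu = mean(link_scores)
    sigma = stdev(link_scores)
    if sigma == 0:
        return []  # all links look alike; nothing stands out
    return [i for i, score in enumerate(link_scores)
            if (score - mu) / sigma > threshold]

# Hypothetical spam scores for one page's inbound links; the last two
# look like a sudden burst of manipulative links.
scores = [0.1, 0.2, 0.15, 0.1, 0.18, 0.9, 0.95]
print(ignorable_links(scores))  # → [5, 6]
```

Under this approach the two outlier links would simply be discounted, leaving the page's ranking untouched, which is the outcome the article argues for.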