How Google’s Ranking Policies Motivate the Creation of Junk
High rankings bring traffic, and traffic often translates into money.
Today, Google is in the position of deciding where a lot of eyeballs end up. Many website owners are tempted to try to beat the system, and by becoming the dominant search engine, Google has created a monster it has had to battle for a decade.
In the early days of search engines, it was mainly website content that drove search engine rankings. In short, the site that had the most content about a keyword tended to rank at the top for that keyword. The result was an explosion of web spam: web pages full of low-quality copy stuffed with the target keyword, because keyword stuffing helped improve search rankings.
Using inbound links to determine a site’s search engine rank made it harder to game the system, until Web 2.0 and the birth of user-generated content on other people’s websites. Soon, article directories were filling up with garbage and the vast majority of blog comments added nothing to the conversation other than “Nice post!!” and a signature link with a keyword in the anchor text. You get the picture.
With its Panda updates, Google tried to remove the profit motive for generating junk content. And largely, it has worked. Using a variety of signals, including user behavior, Google has been able to sift the wheat from the online chaff and relegate the vast majority of junk to the harmless backwaters of the Internet.
In addition, by applying ranking penalties to websites with junk content, Google has given website owners both a carrot and a stick to improve the quality of their websites and user experience. Not surprisingly, the explosion of garbage copy, which once threatened to overwhelm the Web, has subsided. Google should be commended for enforcing fairness in the process — after all, a website owner is solely responsible for the content on their site, except for the rare instance when the site gets hacked.
More recently, Google’s Penguin updates have attempted to level the playing field in the area of inbound links as ranking signals. Penguin can detect links that appear contrived and downplay or ignore their effect on search engine rank. Again, Google is trying to take away the reward for overzealous optimization that creates links with no value to search engine users, and it should be commended for that too.
Google is using a carrot and stick model with Penguin as well, but in this instance it has produced an unintended consequence. You see, when you penalize a website for having crappy content and a lousy user experience, there is no collateral damage. Only the bad website gets the penalty. But unlike website content, which can only be changed by someone with password-protected access to the site, anybody can create links to another website. In fact, creating junk backlinks for a competitor is relatively cheap and easy, certainly requiring less effort than producing high-quality content and links.
Known as negative SEO, this kind of activity can be difficult to detect and almost impossible to defend against. The first sign of trouble is an unexplained drop in organic search engine traffic.
While we applaud Google for figuring out how to devalue junk backlinks, we believe they should not be penalizing sites for something that may be outside their control. By applying penalties, Google is creating a motive for over-competitive website owners to try to gain search engine rank at the expense of their competitors. And, as we’ve learned over the years, wherever there is a financial motive to create junk, it doesn’t take long for people to fill the void.