Google Search Is Getting Worse

Is Google Search getting worse, or is the web degrading?

A recent trend in Google Search has users appending “Reddit” to their queries with the hope that they’ll receive a thoughtful answer written by a human – just one of many attempts to treat the underlying (and growing) problem: queries on Google Search are not being answered.  

Users of the most popular website in the world are noticing an uptick in garbage search results. Some links are simply irrelevant, but an increasing number send searchers to sites full of confusing AI-generated content designed to maximize exposure on search engines.

(Note: if you’re struggling with fickle online algorithms like Google’s, get in touch and we’ll tell you how we can help!)

SEO Content by AI, for AI 

These articles scan as legible from a distance, but they are actually meaningless and factually incorrect rehashes of data created by machine learning programs. Written by machines and for machines (Google’s crawler), they’ve got terrific SEO: length, reading difficulty, keyword use, and formatting all meet Google’s standards.

Unfortunately, though the grammar might be correct, the parts of these pages that aren’t direct plagiarism are essentially word salad. Searching for “the best 120 format film camera,” I came across this breakdown from (a site that no longer exists), which begins with this introduction to the subject:  

“One of the best ways to detect and document high-quality content is with a camera. They are usually small in size, color and can be found anywhere.” 

Nonsense pages like this will set off alarm bells for any human who reads them carefully, but to a web crawler, they scan not just as legible and informative but as far more optimized than the majority of human-penned pages.

Search Results Worsen as the Internet Degrades 

The gamification of Google’s algorithm by bad actors to place higher in the search results is nothing new, but the severity of the current trend is leading some to wonder about causes, solutions, and potential competition. 

Marissa Mayer, tech executive and former CEO of Yahoo, posits that Google’s algorithm is showing users an accurate cross-section of an increasingly spammy pool of web pages – in other words, Google’s system is functioning correctly.  

Google’s decline, then, is a result of smartly manufactured spam content butting heads with an insufficiently advanced algorithm. Activity by both Google and its users supports this theory:

Poor Quality Sites Encourage Google to Keep Users On-Page 

In the above interview, Mayer makes the interesting point that the declining quality of the greater web encourages Google to keep users on-page. You’ve probably noticed this already: Google uses techniques like structured data to answer queries before the user clicks a link (or even finishes typing).
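“Structured data” here means schema.org markup embedded in the page itself. As a minimal sketch (the question and answer text are purely illustrative), an FAQ page might include a JSON-LD block like this, which Google can read and surface directly on the results page:

```html
<!-- Minimal schema.org FAQPage markup (JSON-LD). The content is
     illustrative; Google reads blocks like this to answer queries
     in the results page itself. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What is 120 format film?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "120 is a medium-format roll film, wider than 35mm film."
    }
  }]
}
</script>
```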

So if you want your site to appear in results, simplicity and organization are more important than ever… but does it really matter if nobody clicks your link?

Users Append Trusted Sites to Google Search Queries 

Early last year, an article in Boing Boing suggested that users strip inauthentic content from their Google search results by adding “Reddit” to their queries. Doing so aims to serve users a conversation between human beings rather than a machine-learning regurgitation.

Since then, the lack of quality results on Google has only become more dire, and searches for Reddit have increased significantly. If Reddit had a more functional search system, these users might not be on Google at all. 

Black Hat Affiliate Marketing  

Some of these inauthentic sites make money by showing copious ads, resulting in a spammy feel that’ll be immediately familiar to any long-time internet user. However, some have taken the slightly more insidious approach of showing no image ads but including Amazon affiliate links for each mentioned product. 

Of course, authentic sites like The Strategist abound with affiliate links. For this reason, we suggest users learn to identify crawler-bait quickly, which leads us to our next point:

Bounce Rate and Time-on-Page Insights 

One of search engines’ favorite measures of a site’s success is session length: the amount of time a user remains on the site after arriving. Juxtaposed with the site’s bounce rate – the percentage of users who land on a page and immediately flee – this is a powerful indicator of a site’s ability to provide useful information and satisfy the user’s query.

For purveyors of spam, the goal becomes keeping users on the page as long as possible without doing the work necessary to provide that useful information. The most popular way to do this is to frontload the page with a meandering introduction, which must be long enough to bump up the site’s average session duration and vaguely on-topic enough to convince users the thing they’re looking for is coming soon. 
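As a rough sketch of how these two metrics fall out of raw visit logs (the field names and the one-page-view definition of a bounce are illustrative assumptions, not Google’s actual method):

```python
# Sketch: computing average session length and bounce rate from visit
# logs. A "bounce" is assumed to be a session with a single page view.

def session_metrics(sessions):
    """sessions: list of dicts with 'pages_viewed' and 'seconds_on_site'."""
    if not sessions:
        return {"avg_session_seconds": 0.0, "bounce_rate": 0.0}
    bounces = sum(1 for s in sessions if s["pages_viewed"] == 1)
    total_time = sum(s["seconds_on_site"] for s in sessions)
    return {
        "avg_session_seconds": total_time / len(sessions),
        "bounce_rate": bounces / len(sessions),
    }

visits = [
    {"pages_viewed": 1, "seconds_on_site": 5},    # landed and fled
    {"pages_viewed": 4, "seconds_on_site": 180},  # engaged reader
    {"pages_viewed": 1, "seconds_on_site": 8},    # another bounce
    {"pages_viewed": 2, "seconds_on_site": 95},
]
print(session_metrics(visits))
# → {'avg_session_seconds': 72.0, 'bounce_rate': 0.5}
```

A meandering introduction attacks exactly these numbers: it pads `seconds_on_site` for every visit, including the ones that would otherwise register as bounces.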

If you’ve ever looked for a recipe and landed on a multi-paragraph story about someone’s beloved ancestors, now you know why (humans use this technique too). Once you’re already on the page, the easiest way to identify this is to focus on a short snippet of text. It will either be relevant and comprehensible – and you can skip ahead to the recipe – or not. 

However, it’s also a good practice to filter for these results during the initial search by learning the names of popular and dependable sites for regularly searched topics. 

Taking Advantage of AI’s Strengths 

How are crawlers meant to distinguish between authentic and inauthentic webpages? At the moment, their algorithms appear too unsophisticated, and there’s been no indication that AI is up to the task. 

The outcome might be search engines abandoning their fine-tuned algorithms, which they’ve had to update each time a new clickbait technique arises. Instead, they’ll take advantage of what AI can do well – absorb information and produce intelligent-seeming prose. 

Blaming Google Isn’t Completely Fair 

I’m inclined to agree with Marissa Mayer that the quality of the web is degrading overall; anecdotally, I searched “best 120 format film camera” on Bing, and the first result was a (different) AI-generated Amazon affiliate listicle.

Whatever its cause, the current situation leaves Google unusually vulnerable to competitors with novel approaches to the problem. We’ll discuss these more in our next post. 

Search engine marketing and optimization are as complex and frustrating as they have ever been – get in touch and learn how we can help you boost your business’s profile.
