Is Google Becoming Worse at Search Results?
Where Have All the Relevant Results Gone for Detailed Search Queries?
You may have noticed that, until about seven years ago, you could enter a detailed, long-tail query with a number of words and Google would return pages and pages of results, most of them unique. After a few minutes of browsing, you would be able to locate the specific page with that information. It might be on the 11th page of results, but you would find it. AndrewDucker has written on this phenomenon.
Now, though, a query may yield dozens of near-identical articles, usually the most recent and 'newsy' ones. Result after result covers the same latest bit of news, as if the subject matter didn't exist five years ago. A few have coined this 'Google Alzheimer's'. The range of results is being narrowed; it is as if Google is forgetting.
Saving Energy = Worse Results
What seems to be the cause of this is Google's focus on gathering and matching popular content, and its desire to expend less processing power indexing the internet and generating search results pages.
There is another reason for this narrowing and mainstreaming of the SERP: Google wants traffic streamlined into buckets of similar content, to which it can match adverts.
More people going to a given bucket means more competition among advertisers to place search ads on the SERP. More competition means more bids, and higher CPCs.
In terms of organic search results, you will see a loss of variety and relevance in the results for long-tail, detailed, and therefore rare search queries.
Google will ignore certain words you enter and instead interpret the very general meaning of the search query for itself. Using machine learning, it converges on the most likely results, trims back the range, and responds only with the most popular choices.
In the old days it worked differently. Google crawled everything that web admins told it to crawl. It therefore had many more matches to make, and the search results could run to dozens of pages: mostly unique results that closely matched all the search keywords you entered. This put the emphasis on the user to refine the results by adding more information to the query.
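The contrast between the two behaviours can be sketched in code. The following is a minimal, hypothetical illustration (not Google's actual algorithm, and the function names, documents, and frequency thresholds are invented for the example): the old approach requires every query term to appear in a document, while the newer approach drops rare terms and ranks whatever remains by popularity.

```python
# Hypothetical sketch of the two retrieval strategies described above.
# This is an illustration only, not Google's real ranking system.

def strict_match(query, documents):
    """Old-style retrieval: return every document containing ALL query terms."""
    terms = set(query.lower().split())
    return [doc for doc in documents
            if terms <= set(doc["text"].lower().split())]

def popularity_match(query, documents, term_frequency, min_freq=2, top_n=3):
    """Newer-style retrieval: drop query terms that are rare overall,
    then rank the remaining matches by popularity and keep the top few."""
    terms = {t for t in query.lower().split()
             if term_frequency.get(t, 0) >= min_freq}
    matches = [doc for doc in documents
               if terms & set(doc["text"].lower().split())]
    matches.sort(key=lambda d: d["popularity"], reverse=True)
    return matches[:top_n]

docs = [
    {"text": "rare 1970s spectroscopy paper on cadmium telluride",
     "popularity": 2},
    {"text": "latest news on spectroscopy breakthrough",
     "popularity": 95},
]
freq = {"spectroscopy": 10, "cadmium": 1, "telluride": 1}

# Strict matching surfaces the obscure page that satisfies every term.
print(strict_match("cadmium telluride spectroscopy", docs))
# Popularity matching discards the rare terms and leads with the news item.
print(popularity_match("cadmium telluride spectroscopy", docs, freq))
```

Note how the second function silently throws away exactly the terms that made the query long-tail, which is the narrowing effect described above.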
Google has drastically cut the processing power it devotes to matching search queries, and it isn't crawling everything.
Detailed scientific writing, for example, is now much harder to find if it is older.
People who like looking for detailed, hard-to-find information report that they often get better results from DuckDuckGo and Bing.
PPC Has Also Been Streamlined and Generalised
Mirroring this populist approach to organic search, Google changed its PPC approach in the same fashion over the last few years. Many long-tail keywords became unbiddable; every agency knows the 'insufficient search volume' problem. Google wants to mainstream many similar search queries/keywords together in order to increase the number of keyword bids going into each auction.
The result is that it is harder to target the most intent-driven search phrases, and much more negative-keyword sculpting and layering of ad groups is needed to keep clicks on the most relevant queries and achieve the lowest CPCs.
Google's mixing of machine learning and language interpretation is even eroding the usefulness of exact-match keywords, for which it now allows considerable variations.
In the long run this erodes some of the value of having an internet that once shone with long-tail search terms.