Once upon a time, the internet was a great place where you could do anything and get away with it. As the internet grew, so did social networking sites. There was so much information that it became difficult to discern truth from fiction. As people continued to dump information on the internet, regardless of relevance, Google stepped in and began to throw its weight around in an effort to ensure the quality of its search results. This became necessary mainly because Google was losing money to paid SEO companies that had figured out how to game the system so that their clients always came to the top of the search results. There were also companies that copied work from other websites through APIs and automated algorithms, creating very low-quality websites with nothing but copied material.
So in this article I discuss the different algorithms Google added to its search rankings to make its results more accurate.
Google Panda is named after Navneet Panda, the Google engineer who developed the technology behind the algorithm, which was first released in February 2011. Panda is aimed at delisting websites that use low-quality content. If a website fails the Panda test, it will be removed from the search results or penalized significantly.
Any websites that were blatantly copying content from other websites were automatically blacklisted. Websites that indulged in keyword stuffing were also penalized, as were websites with excessive advertisements. Following a wave of “scrapers” (websites built from content copied from other websites), Google was compelled to issue a list of 23 bullet points in a bid to define a high-quality site.
Google Penguin is the codename for an algorithm launched in April 2012. Penguin emerged from the Panda algorithm, which was more limited in scope than its successor, and the two algorithms still operate together. Panda is stricter (penalty or no penalty), whereas Penguin is designed to influence rankings on a smaller scale. For example, Penguin can influence where in the rankings your page is displayed, such as position 1 versus position 5. Google Penguin was designed to weed out websites that violated Google’s “Webmaster Guidelines”. These guidelines are given very explicitly and are the basis of good SEO.
According to journalist and web editor Steve Masters, “The Hummingbird approach should be inspirational to anyone managing and planning content — if you aren’t already thinking like Hummingbird, you should be. In a nutshell, think about why people are looking for something rather than what they are looking for. A content strategy should be designed to answer their needs, not just provide them with facts.” This is a fitting description of what Google Hummingbird is designed to do. Google began using Hummingbird in August 2013, but the official announcement came only in September of the same year. The main objective of Google Hummingbird is to create a more organized system for indexing information. Like the previous campaigns, Hummingbird also aims to improve the quality of content posted on the internet. The main focus is on the context of content and the meaning of entire sentences and conversations, rather than the analysis of single words or phrases.
Some top-ranking sites saw their page rank revised based on the flow of natural content on their websites. Content organized around “Who, Why, Where, and How” benefited from this change, and extensive use of synonyms also helped improve the SEO of these sites. For example, if you search for “colleges”, you might get some results that include the synonym “universities”. “St. Patty’s Day” will return results for “St. Patrick’s Day”. Hummingbird helps Google better understand the user’s input and convey the value of the output even if the word isn’t an exact-match keyword.
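To make the idea concrete, here is a toy sketch of synonym-based query expansion. This is not Google’s actual implementation — Hummingbird operates at a vastly larger scale and considers whole sentences — and the hand-built synonym table below is purely illustrative:

```python
# Toy illustration of synonym-based query expansion -- a drastically
# simplified stand-in for the kind of matching Hummingbird enables.
# The SYNONYMS table is a hypothetical, hand-built example.
SYNONYMS = {
    "colleges": {"universities"},
    "st. patty's day": {"st. patrick's day"},
}

def expand_query(query):
    """Return the query term plus any known synonyms."""
    term = query.lower()
    return {term} | SYNONYMS.get(term, set())

def matches(query, document_terms):
    """A document matches if it contains the query term or any synonym."""
    return bool(expand_query(query) & {t.lower() for t in document_terms})
```

With this sketch, a page containing “universities” would still match a search for “colleges”, even though the exact keyword never appears on the page.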
In July 2014, the Google Pigeon algorithm was launched. The algorithm is supposed to bridge the gap between general, worldwide search results and local search results. It also addresses a few shortcomings previously identified in the way Google presents search results. The focus of this algorithm is to increase the rank of local listings in a search. According to Google, the Pigeon algorithm has created “closer ties between the local algorithm and core algorithm(s).” Google Pigeon is also a response to complaints by Yelp that even when a search was done specifically for Yelp’s reviews, Google’s reviews were being highlighted instead.
Google Maps results appear and disappear depending on slight adjustments in addresses. Several websites have had their rankings drop or disappear entirely, then suddenly pop back. However, the general consensus seems to be reflected by Neil Patel, a columnist for Search Engine Land, who says, “Those of us who travel to San Francisco, talk to colleagues in Tel Aviv, tweak a client’s website in London, and teleconference with service providers in Perth and New York still need to grasp the importance of local search.”
If a company based out of Virginia makes a website that includes over 100 pages with each page being a different city in America, Pigeon will not rank those different city pages unless the company has a local presence (address) in each of those cities. Local results are designed for people who want to drive or call a company or organization in their area. Local results are not for national companies trying to pretend to be local when they are not.
The common objective of all four algorithms rolled out by Google over the last five years is to improve the quality of search results and to give prominence to high-quality, relevant content. Each campaign launched by Google had specific objectives that distinguished it from the others. Let us consider the differences between the four campaigns:
- Google Panda was the first major overhaul since the launch of the search engine, and one of the most comprehensive – even now, in 2015, there are websites that are struggling to recover from the Panda penalty imposed on them.
- The later algorithms released by Google were more focused, each in its own specific way. Initially, it was difficult to identify violations under the Panda regime. However, with the release of Google’s 23 bullet points and its Webmaster Guidelines, it became easier to identify and rectify the deviations of various websites.
- Google Penguin targeted Black Hat operators who used unscrupulous practices to achieve good page rankings in violation of the Google Webmaster Guidelines – more specifically, those who practiced spamdexing and link bombing.
- The goals of Google Hummingbird were more subtle but far-reaching, with an aim to improve the quality of content by changing the way an algorithm detects the meanings of words and conversational search.
- With Pigeon, Google is attempting to give more importance to local search results, which should make local Google searches more effective for users.
As an SEO professional myself, I have mixed feelings about these algorithms. While we enjoy the quality of many of the search results now, great content is often buried because it just does not have a chance to break through all of these algorithms. Many SEOs think that Google is simply doing this to ensure money is spent on its main revenue generator – Google AdWords, Google’s pay-to-play system that displays ads on the search page and on websites.
Regardless of the challenges, SEO is still alive and well today. It centers on having quality content, building brand awareness, and starting conversations.