Panda, Penguin and Hummingbird Algorithms

Google Algorithms

Search engines such as Google use algorithms to give people a better chance of finding exactly what they are searching for. For example, if you typed in the phrase, “Ways to organize my closet,” Google would apply the more than 200 ranking factors, or signals, built into its algorithm to pull up results that match that phrase.
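
To make the idea of ranking signals a little more concrete, here is a minimal sketch in Python of how a search engine might combine several signals into one score per page. The signal names and weights below are invented purely for illustration; Google's actual signals and weights are not public.

```python
# Toy illustration only: signal names and weights are made up,
# not Google's real (undisclosed) ranking factors.

def score_page(signals, weights):
    """Combine a page's signal values into a single ranking score."""
    return sum(weights[name] * value for name, value in signals.items())

weights = {
    "keyword_relevance": 0.5,  # how well the page matches the query
    "content_quality": 0.3,    # originality, depth, usefulness
    "link_quality": 0.2,       # quality of links pointing at the page
}

pages = {
    "closet-organizing-guide": {"keyword_relevance": 0.9,
                                "content_quality": 0.8,
                                "link_quality": 0.6},
    "thin-keyword-stuffed-page": {"keyword_relevance": 0.9,
                                  "content_quality": 0.2,
                                  "link_quality": 0.1},
}

# Rank results for "ways to organize my closet" by descending score.
ranked = sorted(pages, key=lambda page: score_page(pages[page], weights),
                reverse=True)
print(ranked)  # the higher quality guide ranks first
```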

Over the past few years, Google has added new algorithms to its search engine to produce higher quality results for the searches we run. If you are new to these algorithms, this article is for you.

Panda: Quality control

Panda was the first of the three, launched by Google in February 2011. Google had big plans for this algorithm: it was designed to keep websites and pages with low quality content from reaching the top rankings when their keywords were searched. At its core it is a quality filter, modeled on the sets of questions Google’s human Quality Raters use to evaluate websites and pages and determine their overall quality.

The major drawback of this algorithm is that some high quality content on several websites was judged to be low quality, and those once top ranked websites plummeted toward the back of the results for searches on their keywords. This caused major upsets, because these high quality sites and pages were replaced by sites that could be in violation of copyright law.

In response to these ranking upsets for sites whose high quality content was labeled low quality by Panda, Google has suggested that they either rewrite the content to make it more original, or remove the pages Panda rated as poor quality. By taking one of these two options, once top ranking sites have real hope of seeing their rankings rise back toward the top.

How useful these recovery methods are depends on the site that was hit by Panda. If new, better content cannot be produced, removing the affected pages is the only remaining option for recovering a place among the top search results.

Penguin: Off page factors

Penguin was introduced by Google in April 2012 as an algorithm update. Google released it to decrease the rankings of websites that violate Google’s Webmaster Guidelines. Most of these violators used link schemes to create the impression of having numerous good links directing searchers to their pages when in reality they do not.

For the most part, these are sites run by heavy spammers whose pages overwhelm the search results whether or not they are relevant to a person’s search. Later updates to this algorithm were meant to improve the rate at which spammers are caught while ignoring non-spammers and leaving their rankings intact.

However, this isn’t always how it plays out. Some sites have picked up weak links from other sources, or the wrong links altogether, and as a result drawn the Penguin algorithm’s attention as guideline violators. Once Penguin flags such a site, it strips the site of its current ranking and sends it sailing toward the bottom of the list.

When this happens, there isn’t much recourse for these sites unless they take the necessary steps to shed Penguin’s rating of them as violators. That mainly means removing the bad and weak links pointing to the site, or replacing them with stronger links that clearly count as good ones for the site or page.

When it comes to Penguin, the more good links a site can put in place, the more likely it is to recover its previous ranking and see the Penguin penalty lifted from the page or site.

If you see a notice in Google Webmaster Tools that requires manual action, it usually means unnatural links to or from your site have been found. If this happens, you need to disavow those backlinks immediately and submit a reconsideration request.

For more information on how to request reconsideration, see the Search Console help page.
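
If you do end up disavowing links, the file uploaded to Google’s disavow links tool is a plain text file with one URL or domain per line; lines starting with # are treated as comments. A minimal sketch of such a file (the URLs and domains below are placeholders, not real link sources) might look like this:

```
# Spammy pages I asked to have removed, without success
http://spam.example.com/paid-links.html
http://spam.example.com/blog-comment-links.html

# Disavow every link from an entire domain
domain:link-scheme.example
```

Google recommends treating disavowal as a last resort, after first trying to have the bad links removed at the source.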

Hummingbird: On page factors

The newest of the three algorithms is Hummingbird, launched in August 2013. It differs greatly from the other two because it is Google’s attempt to make search more human in nature. Google wanted this algorithm to think about the searcher’s reasons for searching rather than simply returning results matched to a few keywords.

This algorithm doesn’t focus on just a few keywords within the phrase typed into the search bar; it takes the whole phrase into account, giving priority to answering the five questions: how, what, where, when, and why.

Hummingbird is meant to give searchers the whole picture, rather than just pieces of it, when they type their queries into Google’s search bar.

What this means for some websites and pages is that, because of this “thinking” done by Hummingbird, they can be overlooked: the algorithm sees them as isolated facts rather than the whole picture the searcher is after. Most websites had adopted SEO keywords so they could be found more easily, but with Hummingbird those keywords alone may not be enough to get the attention these sites and pages enjoyed before the algorithm’s release.

This doesn’t mean there is no recourse for these websites and pages, however. If they are willing to review their SEO keywords and tweak them to give a fuller picture of their content, they stand a better chance of ranking higher than if nothing is done. Adapting to the Hummingbird algorithm can take some work, but that work can pay off for sites with high quality content that answers the whole picture of a search instead of just the bare facts.

Conclusion

With each of these three algorithms, Google has attempted to help sites that offer high quality content stand out from the rest, and to help searchers find sites that answer the questions they are asking. Google friendly sites are those that offer content written for user queries rather than for search bots.

Have there been pitfalls with each of these algorithms that have affected high quality websites and pages? Yes, but former top rankings can be regained if these sites and pages are willing to do the work to make their sites the best they can be; in other words, more user-oriented.

As Google continues to develop these algorithms to keep spammers and poor quality sites at the bottom of the rankings, more problems are sure to come for sites that are working hard to provide exactly the opposite when it comes to quality. However, there will be ways to recover from the pitfalls of any new algorithm created by Google or other search engines.
