Google Algorithms’ Cheat Sheet

Part I. Google Algorithms’ Cheat Sheet: Panda, Penguin and Hummingbird

In this article we will discuss Google’s filters, their tasks and functions, in plain language accessible to anyone. This will help you better understand how the search engine monitors websites and eliminate the flaws that can lead to penalties or even a permanent ban.

What Does The Change Of Algorithm Mean?

Let’s begin by defining an algorithm: an extremely complex, constantly evolving system created by Google to provide its users with the most useful, highest-quality information.

In the early days of search engines, marketing specialists had little trouble driving their websites to good positions. Sometimes simply adding meta tags, designed to tell search engines what a website was about, would do the trick.

As Google progressed, its engineers concentrated on making search results as relevant to users’ queries as possible. To achieve this they developed new methods that forced webmasters to be more honest with their websites’ visitors. In particular, website ranking systems were constantly improved. Modern algorithms feature hundreds of factors that define each website’s place in search results. Some of them are genuinely important, such as a good descriptive title (between the <title> … </title> tags within the page code). And some are literally the subject of speculation, such as “Google +1”, for example.
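
For example, a descriptive title is just plain text in the page’s <head> section. A minimal sketch (the store name and wording here are invented for illustration):

```html
<head>
  <!-- A descriptive title that tells both users and search engines
       what the page is about -->
  <title>Handmade Leather Wallets | Example Craft Store</title>
</head>
```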

In the past, Google’s algorithms rarely changed. If your website topped the list for a specific keyword, you could be sure its position wouldn’t change until the next algorithm review, which might happen weeks or months later. After an update the situation would shift, but it would soon stabilize again for quite a while.

Everything changed when “Caffeine” launched in 2010. After its release, search results began to change several times per day. Overall, Google makes around 600 amendments to its algorithms each year, and most of those changes go unannounced. The most noticeable ones, however, are not only widely publicized but also receive loud, memorable names. These innovations immediately become sensations in the SEO world, forcing thousands of marketers around the globe to rack their brains over “taming” them.

Notable examples of such innovations are the Google filters Panda, Penguin and Hummingbird, each of which has been through numerous corrections and updates, every one keenly felt by webmasters and marketers around the world.

What Is The Panda Algorithm?

Panda was launched on February 23, 2011. Its purpose was to help higher-quality websites rank better and, consequently, to push down less reputable sources. The algorithm initially had no name, so in professional SEO circles it was called “the Farmer” due to its obvious focus on content farms. A content farm is a website that gathers (and often steals) information from other web sources and produces texts tailored to keywords. The only purpose of such content is to push the website that bought the texts up in search results. The algorithm, however, ended up affecting many different kinds of websites, and some time later it was named after one of its creators, Navneet Panda.

When Panda first appeared, many SEO professionals assumed the algorithm would target websites with unnatural backlinks. However, it was later revealed that Panda instead assesses the quality of the content presented on websites.

Websites hit by Panda usually found themselves in quite a difficult situation, though some managed to avoid significant losses. The algorithm’s principle is that it lowers the rating of a website as a whole rather than of its individual pages. Sometimes, however, penalties affect only a specific part of the website, such as the news block or a particular subdomain.

To see how your website looks through Panda’s eyes, take a close look at each of its pages and answer the following questions honestly:

1. Would you yourself trust the information provided on this website?

2. Is every article written by an expert or enthusiast who knows the topic well, or is the content shallower than that?

3. Are there duplicate, overlapping or redundant articles on the same or similar topics with slightly different keyword variations?

4. Would you be willing to leave your credit card information on this website?

5. Do articles have spelling, style or factual errors?

6. Does the content reflect the genuine interests of the site’s readers, or was it generated by guessing what might rank well in search engines?

7. Do posts feature original content or information, original statistics and research, or original analysis?

8. How valuable does each page look in comparison with other search results?

9. How carefully is the quality of the website’s content controlled?

10. Does the content provide an all-around understanding of the issue?

11. Is the website a recognized authority in its niche?

12. Has users’ interest in particular pages declined because their content is spread across a wide network of websites, so that individual pages get less attention and care?

13. Were the texts edited by a professional?

14. If the website covered health-related issues, how likely would you be to trust its information?

15. Does the website’s name suggest that it is a reliable source?

16. Does the article provide a detailed description of the problem or a comprehensive answer to the question?

17. Does the article feature in-depth analysis or interesting facts that go beyond the obvious?

18. Would you want to bookmark the page or recommend it to your friends?

19. Does the article contain so many ads that they draw attention away from the main topic?

20. Can you imagine this text as an article in a printed magazine, an encyclopedia, or a book?

21. Are the articles too short, uninformative, or otherwise useless?

22. Does the content pay sufficient attention to detail?

23. Would users complain about the content of the website?

The questions listed above do not mean that Google literally tries to determine algorithmically whether your content is interesting or covers an issue comprehensively. Nobody can properly list the factors behind Panda’s assessment of a website. However, each of the points above can affect that assessment positively or negatively, since Google strives to rank only the best content highly.

Nota Bene!

As for Panda in particular, there are several key issues you should pay very close attention to in order to avoid being penalized by the system:

  • “Poor” content. By this we mean content that is either derivative or not sufficiently interesting and informative to users. You don’t need to post very long texts to be successful, but content consisting of a couple of sentences will probably not be very useful either. So if Panda finds a number of Google-indexed pages with only one or two sentences on them, it will probably treat them as poor-quality pages. If you only have a few such pages, you should be safe, but if your entire website is filled this way, you’re in for trouble.

  • Non-unique content. Panda checks closely whether the content posted on a website has been copied from other sources. Ideally, every website should have its own “individual” content.

  • Duplicate content. This happens when you copy the same content onto several pages within your own website. It often affects online stores and similar commercial sites: if they carry a large assortment of similar products (differing, say, only in color or size), the product descriptions will probably be identical, and the number of identical pages can run to several hundred! Make sure to use the “canonical” tag to avoid this situation (see the sketch right after this list).

  • Low-quality content. In the past, SEO professionals recommended that website owners post new information daily and make sure the content was consistently indexed by Google. However, if your content is of rather poor quality, such behavior will cause more harm than good. Say you have spent several years posting notes on articles you’ve read, which are naturally less valuable than the original articles. As a result, you risk ending up with several thousand low-value pages that Panda will treat as a serious flaw.
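
As an illustration of the “canonical” tag mentioned in the duplicate-content point above, here is a minimal sketch: a page for, say, the blue variant of a product declares the main product page as the preferred version (the URLs are invented for illustration):

```html
<!-- Placed in the <head> of the variant page, e.g. https://example-shop.com/wallet-blue -->
<!-- Tells search engines that the main product page is the preferred
     ("canonical") version, so color/size variants don't count as duplicates -->
<link rel="canonical" href="https://example-shop.com/wallet" />
```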

How To Recover After The Panda Filter?

Google updates the Panda algorithm roughly once a month. While the company used to announce each upcoming update, nowadays it does so only for particularly significant, large-scale ones.

After each update, the search engine re-analyzes websites according to the amended criteria. If you modified your website after being hit by the algorithm (for example, by getting rid of duplicate content), you should notice Google treating your website better after the next update (that is, as soon as the next system scan takes place). However, it may sometimes take two or three months to see positive changes.

From time to time, Google releases an update that changes the very criteria for evaluating website quality rather than just refreshing the algorithm. On May 20, 2014 the search engine released a significant algorithm update named Panda 4.0, and those changes affected numerous websites. Such global changes don’t happen very often, though, so overcoming a Panda penalty usually comes down to seriously improving the quality of your website’s content.

What Is The Penguin Algorithm?

This algorithm debuted on April 24, 2012. Its primary purpose was to detect websites that acquire unnatural backlinks to improve their position in search results. The algorithm also paid attention to several other signals of poor website quality.

Why Are Links So Important?

If a recognized website links to your site, it works like a recommendation. A link from a small, less significant website is not as prestigious, but a large number of such links can still noticeably improve your image in the system’s eyes. That is why SEO optimizers used to hunt for links from every possible source.

The second most important factor for the algorithm is anchor text (the clickable text of a link). Say a link to your website uses the anchor text “SEO Blog”. If a number of other websites link to your site with that anchor text, it helps Google understand that your website may be interesting to people searching for “SEO Blog”.
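
In HTML terms, the anchor text is simply the clickable text between the link tags. A minimal sketch (the URL is invented):

```html
<!-- The anchor text "SEO Blog" tells Google what the linked page is about -->
<a href="https://example.com/blog">SEO Blog</a>
```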

It’s quite obvious how such an algorithm can be manipulated. Say you launch a website for a company that specializes in landscaping services in Orlando. One way to cheat the algorithm would be to create a large number of artificial links with anchor text like “Orlando landscaping services”.

We don’t know exactly which factors Penguin examines, but we can say for sure that it evaluates the “reliability” of links. And if it sees many unreliable backlinks pointing to even one page of a website, the algorithm can lower the rating of that particular page, of a whole section, or even of the entire website.

As for the severity of the penalties imposed, the result differs from case to case. Some sources suffer significant damage, while others face only minor inconveniences. It all depends on the number of artificial links pointing at the website.

How To Recover After The Penguin Filter?

Like Panda, the Penguin algorithm is updated from time to time, and its website quality assessment criteria are revised with each update.

To win back the search engine’s trust, you should identify the unnatural links pointing to your site and either remove them or (if that is impossible) ask Google to ignore them via its Disavow Links tool. Remember, though, to double-check that you submit the right links, or you may do your website more harm than good.
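
For reference, the file accepted by Google’s Disavow Links tool is a plain text list with one entry per line; the domains below are invented for illustration:

```text
# Lines starting with "#" are comments
# Disavow a single spammy page:
http://spam-directory.example.com/links/page1.html
# Disavow every link from an entire domain:
domain:link-farm.example.org
```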

After the next update, Penguin will analyze your website again, and if you made the proper changes, your website will come “back from the dead”. But note that updates are sometimes released only once every six months or even less frequently. Besides, regaining the search engine’s trust doesn’t necessarily mean returning to your former leading positions in search results, since your past ranking was achieved with unnatural links that are no longer there.

What Is The Hummingbird Algorithm?

This algorithm is significantly different from its “colleagues” discussed above.

Interestingly, the numerous claims by website owners that Hummingbird reduced their rankings to nothing are not always true. In fact, this is usually not the case.

Google announced the new algorithm on September 26, 2013, by which point Hummingbird had already been running for about a month. If it had been responsible for catastrophic ranking volatility, the SEO world would have noticed back in August 2013, yet that never happened. If you do suspect Hummingbird is behind your problems, take a close look at your traffic in, say, October 2013. If you see a big decline on the 4th, you can safely blame Penguin, which was updated that day.

Hummingbird actually represents a major restructuring of Google’s approach to its algorithms. According to Danny Sullivan, editor of the popular publication Search Engine Land, if you think of the Google algorithm as an engine, Panda and Penguin are merely replaceable or supporting parts (like a filter or a fuel pump), while Hummingbird is a brand-new engine. Naturally it keeps some features of Panda and Penguin, but overall it is something entirely new.

The purpose of the Hummingbird algorithm is to analyze user queries. For example, if a user asks “Where to eat deep dish pizza in Chicago”, Hummingbird understands that the user will be interested in results from the “restaurants” category. There is even a suggestion that the algorithm was created to increase the efficiency of voice search. When we type a search query, we’re likely to enter something like “Best SEO company Seattle”, whereas if we say it out loud, it would probably sound like “Which Seattle firms provide the best SEO services?” Hummingbird helps the search engine understand precisely what users mean when entering such queries.

How To Recover After The Hummingbird Filter?

Based on the above, you simply need to create content that helps users find answers to specific questions. Such content is qualitatively different from content created merely to inflate your site’s rankings artificially.

However, there is another important point to consider. As already noted, Hummingbird is significantly different from Panda and Penguin. While improving your website may soon make the other algorithms treat you better, it doesn’t work the same way with Hummingbird. Once this “tiny bird” takes a dislike to your pages, you won’t restore their former level of relevance to users’ queries; the only way forward is to find new opportunities to prove to Google that your website can be high-quality and interesting.

Part II. History Of Google Algorithm Updates

Every year Google releases about 500 to 600 changes to its search algorithms. Alongside minor tweaks, from time to time it rolls out global updates of its analysis parameters (algorithms such as Panda, Penguin and Hummingbird were among these landmark updates) that significantly affect the SEO world.

When you know the update dates, you can explain historical changes in your website’s rankings and traffic. This in turn can help you understand how Google’s algorithms and filters work, improving your site’s search engine optimization.

Below is a list of the major changes to the settings and indexing parameters of the Google search engine throughout its 15-year history.

2015

  • Panda 4.2 (#28) — July 17, 2015. Google announced a new Panda update that could take several months to roll out fully. No details have been released so far about what is new or how the refreshed algorithm will affect websites.
  • Quality update – May 3, 2015. After numerous reports of large-scale ranking changes, initially dubbed “Phantom 2”, Google acknowledged a significant algorithm change affecting “quality signals”. The update apparently had a broad impact, but Google gave no details about the nature of those signals.
  • Mobile-friendly update (“Mobilegeddon”) – April 21, 2015. Google announced in advance that it would start boosting the rankings of websites with versions adapted for mobile phones and other personal gadgets. The initial effect of this novelty was apparently less significant than expected.
  • Untitled update – February 4, 2015. Webmasters around the world started talking about a wave of significant changes in Google search results, although the company itself never confirmed an algorithm update.

2014

  • “Pigeon” expands — December 22, 2014. Google rolled out its significant local search update to the United Kingdom, Canada and Australia; the original update had “hit” the United States in July 2014. The essence of the change is that the system favors search results closer to your location. For example, if you’re looking for a “restaurant”, Google will rank restaurants from your own city or area among the top results.
  • Penguin Everflux – December 10, 2014. While Penguin was previously known for rare, mostly global updates, from this date on its data is refreshed more frequently and regularly.
  • Pirate 2.0 – October 21, 2014. More than two years after the original DMCA/“Pirate” release, Google launched another update of its anti-piracy filter, which fights content that violates copyrights. The release was very powerful, capable of pushing affected websites down to critically low rankings.
  • Penguin 3.0 — October 17, 2014. More than a year after the previous Penguin release (Penguin 2.1), Google rolled out this update, which took several weeks to fully activate. Despite forecasts, Penguin 3.0 affected less than 1% of English-language queries; it was mostly a data refresh of this analytical tool.
  • “In the News” block – October 2014. Google changed the presentation of the news block, which now includes links to a much wider range of potentially interesting news sources.
  • Panda 4.1 (#27) – September 23, 2014. Google announced a significant update of one of its major algorithms, which included an algorithmic component. The update affected roughly 3-5% of queries.
  • Author snippets removed – August 28, 2014. Google dropped the author component from search results entirely.
  • HTTPS / SSL update – August 6, 2014. From this day on, the search engine began giving preference to websites using secure HTTPS connections.
  • Pigeon – July 24, 2014. Local search results now strongly take into account the geographic location of each user. This update shook the entire local SEO world.
  • Author photos removed – June 28, 2014. John Mueller announced that Google would stop showing author photos next to results in the SERP.
  • Payday Loan 3.0 – June 12, 2014. Released just a month after the previous version, this update reportedly targeted spammy queries.
  • Panda 4.0 (#26) – May 19, 2014. A major update that refreshed the data as well as the algorithm itself. According to official figures, it affected about 7.5% of English-language queries.
  • Payday Loan 2.0 – May 16, 2014. Google released a new version of its algorithm targeting spammy queries.
  • Untitled update – March 24, 2014. Google representatives never confirmed an update, but webmasters around the world noticed significant ranking fluctuations and considered it a “soft” Panda update.
  • Page Layout #3 – February 6, 2014. The company refreshed its “Top Heavy” algorithm, originally launched in 2012 to target websites with excessive ads above the fold (the part of the page that appears before the user’s eyes as soon as it loads).

2013

  • Authorship shake-up — December 19, 2013. This update reduced the number of results displaying authorship snippets. It was Google’s first strike against webmasters gaming authorship markup and Author Rank.
  • Untitled update – December 17, 2013. Experts and webmasters everywhere noticed historically unprecedented fluctuations in search results, particularly around pages carrying authorship markup.
  • Untitled update – November 14, 2013.
  • Penguin 2.1 (#5) – October 4, 2013. Another algorithm update, following a 4.5-month break, that mostly refreshed the data. The overall effect was deemed moderate, although some webmasters reported severe damage to their websites.
  • Hummingbird – August 20, 2013. Another landmark event in the history of Google’s algorithms. We discussed the nature of this tool above.
  • In-depth articles – August 6, 2013. Search results gained a block of long-form articles relevant to the query, displayed alongside the regular results, allowing users to get the fullest possible information on the topic they’re interested in.
  • Untitled update – July 26, 2013.
  • Knowledge Graph extension – July 19, 2013.
  • Reinstatement of Panda – July 18, 2013.
  • “Multi-week” update – June 18, 2013. From June 12 through the week following July 5, significant ranking volatility was observed, peaking around June 27. The nature of these changes was never revealed, and Google never confirmed an update.
  • “Payday Loan” update – June 11, 2013. The focus was on heavily spammed search queries, such as those related to payday loans and pornography. The rollout was expected to take 1-2 months from the announcement.
  • “Panda dances” – June 11, 2013. Google representatives announced that from then on Panda would be refreshed monthly, with each rollout taking about 10 days.
  • Penguin 2.0 (#4) – May 22, 2013. A long-anticipated update that finally arrived, though its effect on rankings was fairly moderate. Google made no announcements about its nature, but it apparently analyzed websites at a deeper, page level.
  • “Phantom” – May 9, 2013. Many reports appeared that day of a possible algorithm update, and many websites saw significant traffic drops.
  • Panda #25 – March 14, 2013. Matt Cutts suggested this would be the last separate Panda update before Panda was integrated into the core algorithm.
  • Panda #24 – January 22, 2013. The first officially announced update of 2013. It affected about 1.2% of queries.

2012

  • Panda #23 – December 21, 2012. Google’s “Christmas gift”: another update, affecting 1.3% of English-language queries.
  • Knowledge Graph expansion – December 4, 2012. Spanish, French, German, Portuguese, Japanese, Russian and Italian were added. The update involved activating the full analytical mechanism in those languages rather than simply translating results.
  • Panda #22 – November 21, 2012.
  • Panda #21 – November 5, 2012.
  • Page Layout #2 — October 9, 2012. An announced refresh of the filter targeting websites with a large amount of ads at the top of the page.
  • Penguin #3 – October 5, 2012. A minor algorithm update (only about 0.3% of queries affected) when a major one was anticipated.
  • Package of 65 updates in August / September – October 4, 2012. The updates affected search results, Knowledge Graph extensions, page quality assessment criteria and other important areas.
  • Panda #20 – September 27, 2012. The changes affected 2.4% of queries.
  • EMD update – September 27, 2012. The EMD (exact-match domain) filter keeps low-quality websites whose domain names merely mirror narrow user queries out of the results. Its launch had a negative effect on 0.6% of English-language queries targeted by this manipulative practice.
  • Panda 3.9.2 (#19) – September 18, 2012.
  • Panda 3.9.1 (#18) — August 20, 2012.
  • 7-result SERP – August 14, 2012. The first page now shows the top 7 results instead of the top 10 for many searches, a major system change that affected about 18% of queries.
  • Package of 86 updates in June / July – August 10, 2012. A large update package affecting various aspects of the search engine’s operation; many of the updates related to the Panda algorithm.
  • DMCA penalties (“Pirate”) — August 10, 2012. Google announced its commitment to penalizing websites that violate copyrights.
  • Panda 3.9 (#17) – July 24, 2012.
  • Distribution of warning notifications — July 19, 2012.
  • Panda 3.8 (#16) — June 25, 2012.
  • Panda 3.7 (#15) — June 8, 2012.
  • Package of 39 updates in May – June 7, 2012. Most were related to improving Penguin and to better recognition of link schemes; the package also included some changes to Google News.
  • Penguin 1.1 (#2) — May 25, 2012.
  • Knowledge Graph — May 16, 2012. The SERP received a new integrated block providing additional information about people, places, and things. A major novelty from Google.
  • Package of 52 updates in April — May 4, 2012.
  • Panda 3.6 (#14) – April 27, 2012.
  • Penguin — April 24, 2012. After long discussion of a coming “punisher of over-optimized websites”, the world finally saw Google’s novelty: the “Webspam Update”, later renamed “Penguin”. According to initial data, the new tool affected 3.1% of English-language queries.
  • Panda 3.5 (#13) — April 19, 2012.
  • Parked-domain bug — April 16, 2012. A bug caused some domains to be mistakenly treated as “parked” by the system, prompting numerous complaints from webmasters about ranking shake-ups.
  • Package of 50 updates in March – April 3, 2012. The updates included the confirmation of Panda 3.4, changes to anchor text scoring, an image search update, and more.
  • Panda 3.4 (#12) — March 23, 2012.
  • Quality video search — March 12, 2012.
  • Venice — February 27, 2012. An update to local search rankings. In particular, it improved location-based result placement, strengthened the “Freshness” function, and made spam easier to recognize in results.
  • Package of 40 updates in February – February 27, 2012.
  • Panda 3.3 (#11) – February 27, 2012.
  • Package of 17 updates in February – February 3, 2012.
  • Overloaded page tops – January 19, 2012. From this moment on, the search engine penalizes websites with too many ads at the top of the page (the part most visible to every user). The update had no official name but was quickly dubbed “Top Heavy” by the SEO community.
  • Panda 3.2 (#10) — January 18, 2012.
  • Search + Your World – January 10, 2012. Seeking to aggressively promote Google+ user data and profiles, Google announced a radical personalization update. It also added a new toggle that lets users turn personalization off.
  • Package of 30 updates in January – January 5, 2012.

2011

  • Block of 10 updates in December – December 2011.
  • Panda 3.1 (#9) — November 18, 2011.
  • Block of 10 updates in November — November 14, 2011.
  • Freshness update — November 3, 2011. According to Google’s own estimate, this update, which favors fresher content, affected up to 35% of queries.
  • Query encryption – October 18, 2011. Google announced that it would encrypt user search queries for confidentiality reasons.
  • Panda “Flux” (#8) — October 5, 2011. Company representatives announced an expected stream of Panda-related ranking fluctuations, affecting about 2% of queries.
  • Panda 2.5 (#7) — September 28, 2011.
  • Expanded sitelinks – August 16, 2011. For brand queries, Google initially displayed up to 12 sitelinks; the number was later brought down to 6, which proved sufficient for jumping from the SERP straight to the relevant section of a website.
  • Panda 2.4 (#6) — August 12, 2011. Panda rolled out worldwide, to English-language and other-language interfaces alike (except Korean, Chinese and Japanese). 6 to 9 percent of queries were affected.
  • Panda 2.3 (#5) — July 23, 2011.
  • Google+ — June 28, 2011. After several failed attempts at social media, Google moved to strengthen its position with Google+. The new network attracted 10 million users within a mere two weeks.
  • Panda 2.2 (#4) — June 21, 2011.
  • Schema.org — June 2, 2011. Google, Yahoo, and Microsoft announced a unified approach to structured data, intended to improve search engines’ capabilities.
  • Panda 2.1 (#3) — May 9, 2011.
  • Panda 2.0 (#2) — April 11, 2011. Among other things, the data now included websites that users had blocked manually in the SERP or via the Chrome browser.
  • +1 button — March 30, 2011. This option let Google users manually influence rankings by giving a “+1” to resources they like.
  • Panda / Farmer — February 23, 2011. A significant update that seriously affected about 12% of search results. It paid special attention to pages with thin content, content farms, sources with a high proportion of advertising, etc.
  • Attribution update – January 28, 2011.
  • Overstock.com penalty – January 2011. The exposure of Overstock.com’s shady SEO practices was made public and caused an enormous uproar.

2010

  • Negative reviews — December 2010. The New York Times published an article about the DecorMyEyes website, which stayed at the top of Google rankings despite a flood of negative reviews. Google responded that it would adjust its algorithm settings to prevent such situations; from then on, websites notorious for negative reviews have had little chance of favorable positions in search results.
  • Social signals – December 2010. Google and Bing confirmed that they use social signals, including those from Facebook and Twitter, in determining rankings. Matt Cutts announced that this was a relatively new development for Google, although many optimizers had suspected it for quite some time.
  • Instant Previews – November 2010. Search results received a magnifying-glass icon that lets users take a quick look at a preview of the landing page directly from the SERP. The feature underscored Google’s heightened attention to the quality, design and user-friendliness of target pages.
  • Google Instant — September 2010. This novelty made the search box faster and more convenient: preliminary results now appear as you type the query, so you no longer need to click the “search” button.
  • Branded update – September 2010. The same domain may now appear several times in search results; in the past it was limited to one or two listings.
  • Caffeine rollout – June 2010. Several months after testing its renewed search indexing infrastructure, Google finally released it for users to appreciate. According to its creators, the novelty sped up query processing, provided a better indexing system, and made search results 50% fresher.
  • Mayday – May 2010. Webmasters noticed that traffic dropped significantly from late April through early May. Matt Cutts later confirmed that it was related to changes in the algorithm. Websites with a significant amount of “poor content” suffered the most, which in hindsight foreshadowed the coming Panda update.
  • Google Places — April 2010. Google Places had initially launched in 2009 as merely a part of Google Maps. The officially released product ties in more tightly with local search results and includes some local advertising options.

2009

  • Real-time search – December 2009. Real-time results were introduced, drawing on Twitter feeds, Google News, freshly indexed content and more; the range of sources continued to expand.
  • Caffeine (preview) – August 2009. Google presented a preview of significant infrastructure changes to its search mechanisms, paving the way for an expanded search index and real-time ranking and indexing.
  • Vince – February 2009. Optimizers reported a major update that seemed to reflect the search engine’s favorable attitude toward well-known brands and large companies. Matt Cutts presented Vince as a “minor update”, but webmasters believed it would have significant and lasting consequences.
  • “Rel-canonical” tag – February 2009. Google, Microsoft and Yahoo announced support for the rel="canonical" tag, which lets webmasters send canonicalization signals to search bots without affecting regular users.

2008

  • Google Suggest — August 2008. A dropdown list of queries made by other users now appears beneath the search box for convenience. This speeds up searching: usually you only need to type the first few letters or words and then pick an option from the list.
  • Dewey — April 2008. Significant ranking fluctuations were noticed from late March through early April. Some professionals suggested that Google was tinkering with its internal parameters, but these suggestions never received official confirmation.

2007

  • Buffy — June 2007. Google named this update “Buffy” in honor of Vanessa Fox’s departure from the company. Nobody actually knew what changes had been made, and Matt Cutts suggested treating Buffy simply as an accumulation of minor changes.
  • Universal search – May 2007. Google blended traditional search results with News, Video, Images, Local and other verticals. The old SERP became history.

2006

  • False alarm — December 2006. Despite widespread worry about expected algorithm updates and reports of serious ranking changes, Google made no official announcement of any significant update.
  • Supplemental update – November 2006. Google apparently changed how the supplemental index worked and how penalized pages could recover. The company stated that the update itself carried no penalties (although it might sometimes have seemed otherwise).

2005

  • Big Daddy – December 2005. Technically, Big Daddy was an infrastructure update (somewhat similar to the later “Caffeine”). Its rollout took several months and was only completed in March 2006. Big Daddy changed URL canonicalization, the handling of redirects (301/302), and other technical matters.
  • Google Local/Maps — October 2005. After launching the Local Business Center in March 2005, Google merged Maps data into it, which would lead to further local SEO changes.
  • Jagger — October 2005. Google issued a series of updates mostly targeting low-quality links (including reciprocal links, link farms and paid links). Jagger rolled out in three stages from roughly September to November 2005, with the biggest impact in October.
  • Gilligan – September 2005. This update is now often called a false alarm: despite webmasters’ claims about changes, Google never confirmed a major update. Matt Cutts explained that the search engine was updating its index daily at the time, with certain data refreshed roughly once every three months.
  • XML sitemaps — June 2005. Google allowed webmasters to submit XML sitemaps via Webmaster Tools, bypassing traditional HTML sitemaps and giving optimizers a direct (if modest) influence on crawling and indexing.
  • Personalized search – June 2005. Unlike previous personalization attempts that required custom settings and profiles, the 2005 launch drew on users’ own search history to adjust results automatically. Google would keep strengthening this feature and wiring it into other applications.
  • Bourbon — May 2005. Despite its conspicuous name, the update had little effect, mostly touching websites with duplicate content and non-canonical URLs.
  • Allegra — February 2005. Webmasters noticed ranking changes, but the nature of the update was unclear. Some thought Allegra affected the “sandbox”; others believed it adjusted LSI. Some also suggested that Google had begun penalizing suspicious links.
  • “Nofollow” attribute — January 2005. To fight spam and control outgoing links, Google, Yahoo and Microsoft jointly introduced the “nofollow” attribute, which helps flag untrusted links, such as spam comments in blogs. While not a traditional algorithm update, this change had a steadily growing effect on the link graph (see the example right after this list).
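
As an example of the “nofollow” attribute described in the last entry above, a blog can mark links left in comments like this (the URL is invented), telling search engines not to pass ranking credit through them:

```html
<!-- A comment link that passes no ranking credit to its target -->
<a href="http://commenter-site.example.com" rel="nofollow">my website</a>
```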

2004

  • Google IPO — August 2004. Not an algorithm update, but without doubt a significant landmark in the company’s history. Google sold 19 million shares at $85 apiece and raised $1.7 billion; by January 2005 its share price had more than doubled.
  • Brandy — February 2004. Google rolled out several updates, including a massive index expansion, latent semantic indexing (LSI), and increased attention to anchor text relevance. LSI expanded Google’s ability to understand synonyms and took its keyword analysis to a new level.
  • Austin — January 2004. Austin came to clean up what Florida had missed. Google continued to fight deceptive schemes such as invisible on-page text and meta-tag stuffing. Some experts believe Google activated its “Hilltop” algorithm in this update and began paying special attention to the relevance of pages.

2003

  • Florida — November 2003. This update caused significant ranking changes, and many websites lost their positions, infuriating business owners. Florida was essentially a farewell to the low-effort, poor-quality tactics of 1990s SEO and the beginning of a new era of far more sophisticated and interesting competition for top SERP positions.
  • Supplemental index — September 2003. To let the index handle more documents without hurting capacity, Google moved some pages into a “supplemental” index. The topic was widely (and mostly negatively) discussed in professional SEO circles until the index was eventually reunified.
  • Fritz — July 2003. Google’s monthly “dances” came to an end. Instead of overhauling the system once every 30 days, the company began making the necessary changes to its index daily.
  • Esmeralda — June 2003. This was the last of the regular monthly Google updates. The “Google Dance” was replaced by “Everflux”, allowing more frequent changes to the search infrastructure.
  • Dominic — May 2003. Rankings clearly shifted in May, but Dominic’s exact nature was unclear. Google’s “Freshbot” and “Deepcrawler” scoured the web, many websites saw their rankings bounce, and the way backlink weight was counted apparently changed, causing webmasters and site owners plenty of pain.
  • Cassandra — April 2003. Google began actively pursuing websites that exploit links invisible to users, for example links placed white-on-white so that users can’t see them even though they sit in the page’s source code. Such tricks were now recognized by the bots.
  • Boston — February 2003. The first official Google update to receive a proper name. From this point on, Google ran a regular process of updating its indexing and ranking parameters, one that would grow ever more comprehensive and complex over the years.

2002

  • First official update – September 2002. The first officially recognized shake-up of search results.

2001

  • Hilltop — January 2001. Queries are now separated into commercial and non-commercial ones, and PageRank is calculated differently, in particular taking into account dynamic ranking of documents.

2000

  • Google Toolbar – December 2000. Google released its browser toolbar, along with the Toolbar PageRank indicator (TBPR). As soon as webmasters began monitoring TBPR, the “Google Dance” era began. This was effectively the birth of “classic” SEO as we know it today.

Conclusion

We’ve given a brief overview of each major algorithm and traced the history of updates to Google’s analysis and assessment tools. This should help you better understand how the search engine works and what aspects of website design and content it pays attention to.

The subject is broad and complicated, but the most important thing to remember is that all three Google algorithms actually favor webmasters who create truly useful websites that are valuable to internet users. If you are one of those specialists, your success will not be long in coming.