This article will help you understand how the Google search algorithm has evolved and how to anticipate where it is headed. We’ll trace the path Google has taken to improve its algorithm, then make informed guesses about where it may change next and what that means for your search engine optimization efforts.
Let’s track search engine rankings throughout their history, starting with the first update worth considering, all the way back in 2003. Stay with me now.
Early Google Search Quality Updates: Beating Black-Hats
Imagine the date is February 2003. You’ve just watched the Buccaneers beat the Raiders in the Super Bowl, at which the Dixie Chicks sang the national anthem. Google rolls out a search engine update, Boston, which 99.9% of the world had no idea about, and promises additional updates every month. In April, Google followed through with its second update, Cassandra, which cracked down on link-quality issues, including massive linking from co-owned domains. Cassandra also marks one of the first times Google cracked down on truly dishonest SEO practices: the use of hidden text and hidden links. Up until this point, these tactics weren’t just common, they were expected as a way to “game” the search engine, and weren’t seen in a negative light.

The death knell for this attitude came in November, with the Florida update. Google began penalizing sites for low-value late-90s SEO tactics, like keyword stuffing. The response from business owners was extreme: many were furious that their sites had dropped out of the Google index. The period from late 2003 to mid-2005 represents a strong crackdown on black-hat SEO. It also marks the time when Google and leading search engine optimization websites began working together to release information about how to properly conduct search engine optimization: the beginning of white-hat, Google-approved SEO.
Trends Toward Personalized Search
In the middle of 2005, Google launched personalized search, which tracked a user’s search history and customized results based on what that user had clicked. These days, personalized search is almost invisible to the end user, but when doing search engine optimization, always remember that the results you see are not the results I see, even if we search for the same term. This is especially true for sites I favor: because I often visit my own website, isamuel.com, I’ll frequently see blog articles I’ve personally written in my results. This period also included the launch of Google Maps, which eventually helped localize and personalize search results, and Universal Search, which rolled News, Images, Videos, and Shopping into the search engine results page. This work continued through 2009 with Real-time Search, which surfaced Twitter posts and breaking news in results as they were published, and Caffeine, an overhaul of Google’s indexing infrastructure that delivered a fresher, faster-updating index.
The Fight Against Low Content
In May 2010, webmasters noticed another major drop in rankings associated with a Google index update. Sites with large numbers of long-tail keywords and thin content were penalized again after Google realized that webmasters were creating hundreds of thousands of sites, each targeting one or two long-tail keywords. These domains, like teachingassistantpositionsingeorgia.com, were really designed only as lead-generation engines. Management of these sites was outsourced to software applications, and content generation was done by vast teams of writers, most from developing countries and paid minimum wage. Because this kind of content offers little or no value to a reader, Google penalized sites structured in this manner.
Google Going Forward: Social Signals
Let’s take a break here and analyze where the search engines have focused their time and attention thus far in the story. Primarily, Google has targeted “black-hat” SEO and other deceptive practices that were causing low-quality websites to rank unfairly. That’s a problem for Google’s end mission: delivering high-quality, relevant content to searchers. Google now had a new problem to solve: how to create search engine listings vetted by users, rather than by an algorithm looking at keywords and backlinks, which can be manipulated. This crowdsourcing idea culminated, in part, in social signals: in late 2010, Google began looking at content sharing on Facebook and Twitter, and in 2011 it allowed the +1 button to influence search results within a user’s social circle. If I +1 a page, my friends will see that page promoted toward the top of their listings. Search engine optimization experts are no longer islands promoting their content by themselves: now they have to convince people their content is good enough to share socially.
Google no longer needed to look solely at content to determine a site’s rank, which is the criterion search engine optimization specialists had focused on for the past ten or more years. To crack down further on content manipulation, Google rolled out Panda in 2011, perhaps the most famous algorithm update in Google’s history. Panda penalized low-content sites, sites with high ad-to-content ratios, sites that were keyword stuffing, and sites with a number of other quality issues. Next, Google rolled out a major freshness update, which meant that stale content would start dropping in the rankings for time-sensitive queries. This change affected up to 35% of search queries, especially those on topics where new information is published regularly.
The New Role of Search Engine Optimizers
Search engine optimization has changed a lot since the late 90s. Going forward, search engine optimizers will have to be marketers, not just techies. Getting your content talked about on social networks and referenced on other blogs is key. We also expect to see search engines turn into “knowledge engines” rather than simple page indexes. Schema tagging and other types of content-description metadata will likely determine which pages’ content is used to build this “knowledge index,” rather than just the search results index.
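For illustration, here is a minimal sketch of what schema tagging can look like, using the schema.org vocabulary in JSON-LD form (one of the formats schema.org supports). The property names follow the schema.org Article type; the values are hypothetical placeholders, not taken from any real page:

```html
<!-- Hypothetical example: describing an article to search engines
     with schema.org structured data. Placing this block in a page's
     HTML tells crawlers what the content is about in machine-readable
     terms, rather than leaving them to infer it from keywords alone. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "A History of Google Algorithm Updates",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2012-01-15",
  "about": "search engine optimization"
}
</script>
```

Markup like this is exactly the kind of content-description metadata that could feed a “knowledge index”: it hands the engine structured facts about a page instead of making the algorithm guess.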
This is just an overview of the update work Google has done over the last ten years. If you’d like to see this data translated into an infographic, with more detail on individual updates, please leave a message in the comments section.