The relationship of SEO with Google

Google uses PageRank as a function of the quantity and strength of inbound links. PageRank estimates the likelihood that a given page will be reached by a web user who surfs the web by following links from one page to another; a page with a higher PageRank is more likely to be reached by this random surfer.
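The random-surfer idea can be sketched in a few lines of code. This is a minimal illustration of the published PageRank formulation, not Google's production system; the link graph and the damping factor of 0.85 are assumptions chosen for the example.

```python
# Minimal PageRank sketch: the "random surfer" follows an outbound link
# with probability d, or jumps to a random page with probability 1 - d.
# The link graph below is made-up example data.

def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page starts each round with the "random jump" share.
        new_rank = {p: (1.0 - d) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # A page passes its rank evenly to the pages it links to.
                share = d * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # Dangling page: spread its rank over every page.
                for p in pages:
                    new_rank[p] += d * rank[page] / n
        rank = new_rank
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

In this toy graph, page "c" ends up with the highest score because it receives links from both "a" and "b", which matches the intuition that more (and stronger) inbound links mean a higher PageRank.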

Search engines typically incorporate a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation, and Google is no exception. The leading search engines, Google, Bing, and Yahoo, have never disclosed the algorithms they use to rank pages. Because of this, some SEO practitioners have studied different approaches to search engine optimization.

A decade ago, Google began personalizing search results for each user, drawing on their history of previous searches to craft results for users who were logged in. Bruce Clay declared that “ranking is dead” because of personalized search: since rankings would potentially differ for each user, a site's position could no longer be measured in any single, universal way.

Google Caffeine was a change to the way Google updated its index, designed to make new content show up in search results faster than before. A Google employee summed it up by saying that “Caffeine provides 50 percent fresher results for web searches”.

With the growth in popularity of social media sites and blogs, the leading engines changed their algorithms to let fresh content rank quickly within the search results. This led to the introduction of Google Instant, real-time search, in late 2010, an attempt to make search results more timely and relevant.

Historically, websites have copied content from one another and benefited in search engine rankings from the practice. In response, Google announced the Panda update in February 2011, which penalizes websites containing duplicate or non-unique content copied from other websites and sources. The following year, in 2012, the Google Penguin update attempted to penalize websites that used manipulative strategies to improve their rankings, and the 2013 Google Hummingbird update changed the algorithm to improve Google’s natural language processing and its semantic understanding of web pages.
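To make the duplicate-content idea concrete, here is one textbook way that near-duplicate text can be detected: Jaccard similarity over overlapping word "shingles". This is an illustrative sketch only; Google has never disclosed how Panda actually identifies copied content, and the sample strings are invented for the example.

```python
# Illustrative near-duplicate detection via word shingles.
# NOT Google's actual (undisclosed) Panda method - just a common
# textbook technique for comparing two pieces of text.

def shingles(text, n=3):
    """Return the set of overlapping n-word sequences in text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Ratio of shared shingles to total distinct shingles (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

original = "search engines rank pages by analyzing inbound links"
copied   = "search engines rank pages by analyzing inbound links today"
fresh    = "our bakery sells sourdough bread every single morning"

# A high score suggests copied content; a low score suggests original text.
copied_score = jaccard(original, copied)
fresh_score = jaccard(original, fresh)
```

Here the lightly edited copy scores far higher than the unrelated text, which is the kind of signal a duplicate-content filter could act on.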
