Google uses PageRank as a function of the quantity and strength of inbound links. PageRank estimates the likelihood that a given page will be reached by a web user who surfs the web randomly, following links from one page to another. A page with a higher PageRank is therefore more likely to be reached by the random surfer.
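The random-surfer idea can be made concrete with a short sketch. The following is a minimal, illustrative PageRank power iteration, not Google's production algorithm; the `pagerank` function and the toy link graph are hypothetical examples chosen for clarity:

```python
import numpy as np

def pagerank(links, damping=0.85, iterations=100):
    """Power-iteration PageRank over an adjacency list.

    links: dict mapping each page to the list of pages it links to.
    Returns a dict of page -> PageRank score (scores sum to 1).
    """
    pages = list(links)
    n = len(pages)
    index = {p: i for i, p in enumerate(pages)}

    # Column-stochastic transition matrix: M[j, i] is the probability
    # of moving from page i to page j by following a random outbound link.
    M = np.zeros((n, n))
    for page, outlinks in links.items():
        if outlinks:
            for target in outlinks:
                M[index[target], index[page]] = 1.0 / len(outlinks)
        else:
            # Dangling page with no outlinks: the surfer jumps anywhere.
            M[:, index[page]] = 1.0 / n

    # Start from a uniform distribution and iterate:
    # rank = (1 - d)/n + d * M @ rank
    rank = np.full(n, 1.0 / n)
    for _ in range(iterations):
        rank = (1 - damping) / n + damping * M @ rank

    return {page: rank[index[page]] for page in pages}

# Toy link graph (hypothetical): B and C both link to A,
# so A ends up with the highest score.
graph = {"A": ["B"], "B": ["A"], "C": ["A"]}
print(pagerank(graph))
```

In this model the damping factor (conventionally 0.85) is the probability that the surfer keeps following links rather than jumping to a random page, which is what keeps the scores from collapsing onto pages with no outbound links.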
Search engines typically incorporate a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation, and Google is no exception. The leading search engines, Google, Bing, and Yahoo, have never disclosed the algorithms they use to rank pages. Given this, some SEO practitioners have studied different approaches to search engine optimization.
In 2005, Google began personalizing search results for each user by making use of their history of previous searches, crafting results for users who were logged in. Bruce Clay remarked that "ranking is dead" because of personalized search: since rankings would potentially be different for each user, they would no longer be meaningful.
In June 2010, Google announced Caffeine, a change to the way Google updated its index, designed to make new content show up in search results more quickly than before. Describing the change, a Google engineer said that "Caffeine provides 50 percent fresher results for web searches".
With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results. This led to the introduction of Google Instant, real-time search, in late 2010, in an attempt to make search results more timely and relevant.
Historically, websites have copied content from one another and benefited in search engine rankings from the practice. In response, Google announced the Panda update in February 2011, which penalizes websites containing content duplicated from other websites and sources. The following year, the 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings, and the 2013 Google Hummingbird update changed the algorithm to improve Google's natural language processing and semantic understanding of web pages.