How Search Engines and Ranking Work

Before a search engine can display relevant results to an end user, it needs to archive the information that is available on the web. This is achieved by small pieces of software, commonly referred to as ‘spiders’, that crawl the web by scanning pages for content and following the links they contain. The content they gather is then returned to the main ‘bot’.
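To make the crawling step concrete, here is a minimal sketch of how a spider might work, assuming the third-party Python packages `requests` and `beautifulsoup4` are installed. The seed URL is a placeholder, and real crawlers add politeness rules (robots.txt, rate limits, deduplication at scale) that are omitted here.

```python
# A minimal crawler sketch: fetch a page, store its text, follow its links.
from collections import deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def crawl(seed_url, max_pages=10):
    """Breadth-first crawl starting from a single seed URL."""
    queue = deque([seed_url])
    seen = {seed_url}
    archive = {}  # url -> page text, a stand-in for the engine's store

    while queue and len(archive) < max_pages:
        url = queue.popleft()
        try:
            response = requests.get(url, timeout=5)
            response.raise_for_status()
        except requests.RequestException:
            continue  # skip pages that fail to load

        soup = BeautifulSoup(response.text, "html.parser")
        archive[url] = soup.get_text(separator=" ", strip=True)

        # Follow links: resolve relative hrefs and queue unseen URLs.
        for anchor in soup.find_all("a", href=True):
            link = urljoin(url, anchor["href"])
            if link.startswith("http") and link not in seen:
                seen.add(link)
                queue.append(link)

    return archive

pages = crawl("https://example.com")  # hypothetical seed URL
```

The breadth-first queue is what lets a spider discover pages it was never given directly: every link on a fetched page becomes a candidate for the next fetch.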

A bot (web robot) is a program that performs automated tasks over the Internet. Google’s bot indexes the content returned by the spiders and uses proprietary methods to rank websites for the particular search terms (keywords) that end users search for.
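As an illustration of what ‘indexing’ and ‘ranking’ mean at their simplest, the sketch below builds a toy inverted index over the pages a crawler collected and ranks them by raw keyword frequency. This is purely a teaching device: Google’s actual ranking signals are proprietary and far more sophisticated than a word count.

```python
from collections import defaultdict

def build_index(pages):
    """Toy inverted index: map each word to the pages containing it,
    with a per-page occurrence count."""
    index = defaultdict(lambda: defaultdict(int))
    for url, text in pages.items():
        for word in text.lower().split():
            index[word][url] += 1
    return index

def rank(index, keyword):
    """Rank pages for a keyword by how often it appears, a naive
    stand-in for the many signals a real algorithm combines."""
    matches = index.get(keyword.lower(), {})
    return sorted(matches, key=matches.get, reverse=True)

# Hypothetical pages, e.g. the output of the crawler sketched above.
index = build_index({
    "https://example.com/a": "seo tips and seo tools",
    "https://example.com/b": "general web design tips",
})
print(rank(index, "seo"))  # ['https://example.com/a']
```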

The proprietary methods used to rank these search results are collectively known as the algorithm. The search engines do not make public all of the factors that feed into their algorithms. If they released this information, anyone could manipulate their way to the top of the search results.

An SEO company will attempt to reverse-engineer the algorithm by taking the information that is publicly available and combining it with its own testing to find out what works. The proof lies in whether a website ultimately climbs or drops in the rankings.
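That testing usually boils down to changing one thing on a site and then watching where it ranks over time. A trivial sketch of the measurement side, with invented dates and positions:

```python
# Hypothetical rank observations for one keyword: (date, position).
# The numbers are made up for illustration; lower position is better.
observations = [
    ("2024-01-01", 18),
    ("2024-02-01", 11),
    ("2024-03-01", 7),
]

first, last = observations[0][1], observations[-1][1]
if last < first:
    print(f"Climbed from #{first} to #{last}")
elif last > first:
    print(f"Dropped from #{first} to #{last}")
else:
    print(f"Held steady at #{first}")
```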