Search engines move from one web page to another by following the links found on each page, a process known as "crawling" the web. The program that does the crawling is called a "spider" or "bot". When a spider finds a page, it adds that page to the search engine's index. When a user runs a search, the results are pulled from the index and sorted according to the search engine's ranking algorithm.
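The crawl-index-search loop described above can be sketched in miniature. This is a simplified illustration, not how any real search engine is implemented: the `PAGES` dictionary stands in for the live web (a real spider would fetch pages over HTTP), the index is a plain inverted index mapping words to the pages containing them, and "ranking" is just a count of matched query terms.

```python
from html.parser import HTMLParser
from collections import defaultdict

# Hypothetical in-memory "web": URL -> HTML, standing in for real HTTP fetches.
PAGES = {
    "http://example.com/a": '<p>search engines crawl pages</p>'
                            '<a href="http://example.com/b">b</a>',
    "http://example.com/b": '<p>spiders follow links between pages</p>'
                            '<a href="http://example.com/a">a</a>',
}

class LinkAndTextParser(HTMLParser):
    """Collects hyperlinks and visible words from one page."""
    def __init__(self):
        super().__init__()
        self.links, self.words = [], []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)
    def handle_data(self, data):
        self.words.extend(data.lower().split())

def crawl(start_url):
    """Go page to page following links, building an inverted index: word -> {urls}."""
    index, frontier, seen = defaultdict(set), [start_url], set()
    while frontier:
        url = frontier.pop()
        if url in seen or url not in PAGES:
            continue
        seen.add(url)
        parser = LinkAndTextParser()
        parser.feed(PAGES[url])
        for word in parser.words:
            index[word].add(url)       # place the page in the index
        frontier.extend(parser.links)  # the spider follows links it found
    return index

def search(index, query):
    """Pull results from the index; rank by how many query terms each page matches."""
    terms = query.lower().split()
    scores = defaultdict(int)
    for term in terms:
        for url in index.get(term, ()):
            scores[url] += 1
    return sorted(scores, key=lambda u: (-scores[u], u))

index = crawl("http://example.com/a")
print(search(index, "crawl links"))  # both pages match one term each
```

Production crawlers add politeness (robots.txt, rate limits), deduplication, and far more sophisticated ranking signals, but the skeleton is the same: follow links, index what you find, answer queries from the index.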