Spiders (also known as web crawlers) are programs that methodically traverse the World Wide Web, gathering information about the pages and links they encounter. Search engines use spiders to collect the data that populates their indexes, which in turn are used to produce search results.
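To make the idea concrete, here is a minimal, hedged sketch of a crawler in Python using only the standard library. It is not how any particular search engine works; the function names (crawl, LinkExtractor), the breadth-first strategy, the page limit, and the seed URL https://example.com are illustrative assumptions, and real crawlers add politeness rules (robots.txt, rate limiting), deduplication, and far more robust parsing.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags found in a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Breadth-first crawl starting from seed_url (illustrative only).

    Returns a mapping of URL -> list of outgoing links, a toy stand-in
    for the per-page records a search engine would feed into its index.
    """
    seen = {seed_url}
    queue = deque([seed_url])
    index = {}

    while queue and len(index) < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=5) as response:
                html = response.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip pages that fail to download or parse

        parser = LinkExtractor()
        parser.feed(html)

        # Resolve relative links and keep only http(s) URLs.
        links = []
        for href in parser.links:
            absolute = urljoin(url, href)
            if urlparse(absolute).scheme in ("http", "https"):
                links.append(absolute)

        index[url] = links

        # Queue newly discovered pages for later visits.
        for link in links:
            if link not in seen:
                seen.add(link)
                queue.append(link)

    return index


if __name__ == "__main__":
    records = crawl("https://example.com", max_pages=3)
    for page, links in records.items():
        print(page, "->", len(links), "links")
```

The breadth-first queue is what makes the traversal "methodical": each discovered link is visited once, and the record kept for each page (here, just its outgoing links) is the kind of raw material an indexer would later process.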