A crawler is a program that visits websites and reads their pages and other information in order to create entries for a search engine index.
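
To make the idea concrete, here is a minimal sketch of such a crawler in Python, using only the standard library. The URL, page limit, and the notion of an "index entry" (just the page's visible text keyed by URL) are illustrative assumptions, not a production design; real crawlers also honor robots.txt, rate limits, and duplicate detection.

```python
# Minimal crawler sketch (illustrative assumptions: seed URL, page limit,
# and "index entry" = the page's visible text keyed by its URL).

from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class PageParser(HTMLParser):
    """Collects link targets and visible text from one HTML page."""

    def __init__(self):
        super().__init__()
        self.links = []
        self.text_parts = []

    def handle_starttag(self, tag, attrs):
        # Record the destination of every <a href="..."> link.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        # Keep non-empty text runs as the page's readable content.
        if data.strip():
            self.text_parts.append(data.strip())


def crawl(seed_url, max_pages=10):
    """Visit pages starting from seed_url and return a tiny index:
    a dict mapping each visited URL to the text found on that page."""
    index = {}
    queue = deque([seed_url])
    seen = {seed_url}

    while queue and len(index) < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to load

        parser = PageParser()
        parser.feed(html)

        # Create the index entry for this page.
        index[url] = " ".join(parser.text_parts)

        # Queue newly discovered links for later visits.
        for link in parser.links:
            absolute = urljoin(url, link)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

    return index


if __name__ == "__main__":
    pages = crawl("https://example.com", max_pages=3)
    for url, text in pages.items():
        print(url, "->", text[:80])
```

The breadth-first queue mirrors how a crawler follows links outward from a starting page, while the `seen` set prevents it from revisiting the same URL; a search engine would then feed each entry into a much richer indexing pipeline.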