How does Googlebot work?
Google explains how Googlebot works here: https://support.google.com/webmaster...er/70897?hl=en
Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index. Google uses a huge set of computers to fetch, or "crawl", billions of pages on the web. The program that does the fetching is called Googlebot (also known as a robot, bot, or spider).
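The discover-and-fetch loop described above can be sketched roughly as a breadth-first crawl: fetch a page, extract its links, and queue any URLs not yet seen. This is a minimal illustration, not Google's actual implementation; the `fetch` callback here is a hypothetical stand-in for a real HTTP GET.

```python
from collections import deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(fetch, seed_urls, max_pages=100):
    """Breadth-first crawl: fetch pages, extract links, queue new URLs.

    fetch(url) -> html string; a stand-in for an HTTP request.
    Returns a dict mapping each crawled URL to its HTML.
    """
    seen = set(seed_urls)
    queue = deque(seed_urls)
    index = {}
    while queue and len(index) < max_pages:
        url = queue.popleft()
        html = fetch(url)
        index[url] = html
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            if link not in seen:       # discover new pages via links
                seen.add(link)
                queue.append(link)
    return index

# Tiny in-memory "web" to show discovery: page a links to b, b back to a.
pages = {"a": '<a href="b">next</a>', "b": '<a href="a">back</a>'}
index = crawl(lambda u: pages.get(u, ""), ["a"])
print(sorted(index))  # both pages get discovered and fetched
```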
Googlebot functions as a search bot: it crawls a site's content and interprets the directives in the site's robots.txt file. These crawlers work by reading web pages and then making the page content available to all Google services (via Google's caching proxy).
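To illustrate how a crawler interprets robots.txt, here is a small sketch using Python's standard-library `urllib.robotparser`; the robots.txt rules and the example.com URLs are made up for the demonstration.

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# parse() accepts the robots.txt body as a list of lines.
rp.parse([
    "User-agent: Googlebot",
    "Disallow: /private/",
    "",
    "User-agent: *",
    "Disallow: /",
])

# Googlebot's own group allows everything except /private/.
print(rp.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
```

A real crawler would first download https://example.com/robots.txt (for example with `rp.set_url(...)` and `rp.read()`) and check `can_fetch()` before requesting each page.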