Why Is A Robots.txt File Used?
Robots.txt tells a crawler how it should crawl and index the pages of a website.
Website owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. The line "User-agent: *" means the section applies to all robots, and "Disallow: /" tells the robot that it should not visit any pages on the site.
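For example, a robots.txt that keeps every robot away from the entire site contains just these two lines:

    User-agent: *
    Disallow: /

To shut out only part of a site, the Disallow line names a path instead (for example, Disallow: /private/), and an empty Disallow value places no restriction at all.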
Robots.txt is a plain text file containing instructions for search engine robots; it tells them which pages of the site are allowed and which are disallowed for crawling.
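If you want to check programmatically whether a given URL may be crawled, Python's standard library includes urllib.robotparser. The sketch below parses an inline set of rules (the rules and URLs are made-up placeholders for illustration) and asks whether two URLs are fetchable:

    from urllib.robotparser import RobotFileParser

    # Hypothetical rules: every robot ("*") is barred from /private/,
    # everything else remains crawlable.
    rules = [
        "User-agent: *",
        "Disallow: /private/",
    ]

    rp = RobotFileParser()
    rp.parse(rules)

    # can_fetch(useragent, url) answers: may this robot visit this URL?
    print(rp.can_fetch("*", "https://www.example.com/index.html"))    # True
    print(rp.can_fetch("*", "https://www.example.com/private/data"))  # False

In practice you would point set_url() at the site's live /robots.txt and call read() instead of supplying the rules inline.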