Why Is A Robots.txt File Used?



worldescape
12-04-2018, 12:32 AM
Why Is A Robots.txt File Used?

vinukum
12-04-2018, 12:36 AM
Robots.txt tells crawlers how they should crawl and index the pages of a website.
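
For a concrete picture, about the smallest robots.txt that does anything tells every crawler to skip one page (the path here is just an example):

User-agent: *
Disallow: /private-page.html

Note that robots.txt controls crawling rather than indexing directly; a blocked page can still appear in search results if other sites link to it.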

MVM Infotech
12-04-2018, 09:26 PM
Website owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. The line "User-agent: *" means the section applies to all robots, and "Disallow: /" tells the robot not to visit any pages on the site.
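
Putting those two directives together, a minimal robots.txt that blocks every crawler from the entire site would look like this (the file always sits at the site root, e.g. example.com/robots.txt):

User-agent: *
Disallow: /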

RH-Calvin
12-06-2018, 09:47 AM
Robots.txt is a text file containing instructions for search engine robots. It lists which webpages crawlers are allowed or disallowed to visit.
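
As a sketch of such an allow/disallow listing (the paths below are made up purely for illustration):

User-agent: *
Disallow: /admin/
Disallow: /tmp/
Allow: /public/

Disallow blocks the named paths, while Allow, an extension honored by major crawlers such as Googlebot and Bingbot, explicitly permits one.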

neelseowork
12-06-2018, 10:15 PM
Website owners use the robots.txt file to give instructions about their site to web robots.

The "User-agent: *" means this section applies to all robots.
The "Disallow: /" tells the robot that it should not visit any pages on the site.