Why Is A Robots.txt File Used?
Robots.txt tells crawlers how they should crawl and index the pages of a website.
Website owners use the /robots.txt file to give instructions about their site to web robots; this convention is called the Robots Exclusion Protocol. The line "User-agent: *" means the section applies to all robots, and "Disallow: /" tells a robot that it should not visit any pages on the site.
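Putting those two directives together, a minimal robots.txt that blocks every robot from the entire site looks like this (the `/private/` path in the second example is a hypothetical directory used only for illustration):

```
# Block all robots from the whole site
User-agent: *
Disallow: /

# Or: block only one directory, allow everything else
User-agent: *
Disallow: /private/
```

The file must be served at the root of the site (e.g. https://example.com/robots.txt) for robots to find it.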
Robots.txt is a plain text file containing instructions for search engine robots. It specifies which of a site's pages robots are allowed or disallowed to crawl.
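To see how a crawler interprets these allow/disallow rules, the sketch below uses Python's standard-library `urllib.robotparser`; the rules and URLs are made-up examples, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content: block /private/, allow everything else.
rules = """User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A well-behaved crawler asks can_fetch() before requesting a URL.
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/index.html"))         # True
```

Search engine crawlers perform this same check before fetching a page, which is why a single Disallow line can remove an entire section of a site from their index.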