
Thread: What is disallow in robots.txt file?

  1. #1
    Registered
    Join Date
    Jan 2019
    Posts
    259

    What is disallow in robots.txt file?

    What is disallow in robots.txt file?

  2. #2
    Registered PoolMaster's Avatar
    Join Date
    Apr 2019
    Location
    USA
    Posts
    219
    Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called The Robots Exclusion Protocol. The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site.
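
    For example, a minimal robots.txt that blocks all robots from a single directory while leaving the rest of the site crawlable could look like this (the /private/ directory name is just a hypothetical illustration):

    User-agent: *
    Disallow: /private/

    A compliant crawler reading this file would skip any URL under /private/ but would still crawl the rest of the site.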

  3. #3
    The second line in any block of directives is the Disallow line. You can have one or more of these lines, specifying which parts of the site the named spider cannot access. An empty Disallow line means you are not disallowing anything, so in effect the spider can access all areas of your site.

    User-agent: *

    Disallow: /

    The example above would block all search engines that honor robots.txt from crawling your site.

    User-agent: *

    Disallow:

    The example above would, with just one character fewer, allow all search engines to crawl your whole site.
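
    A Disallow rule can also be scoped to a single crawler by naming it in the User-agent line. The example below (using a hypothetical /testing/ directory) blocks only Googlebot from that directory while leaving every other robot unrestricted:

    User-agent: Googlebot
    Disallow: /testing/

    User-agent: *
    Disallow: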

  4. #4
    Registered
    Join Date
    Jun 2019
    Posts
    133
    It is an instruction to search engines that prevents (restricts) access to specific pages or directories of a site.
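
    As a rough sketch of how a well-behaved crawler applies these restrictions, Python's standard urllib.robotparser module can read a site's robots.txt and report whether a given URL may be fetched. The domain and paths below are made-up placeholders:

    from urllib import robotparser

    # Download and parse the site's robots.txt (example.com is a placeholder)
    rp = robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")
    rp.read()

    # True if the rules allow any robot ("*") to fetch the URL, False if it is disallowed
    print(rp.can_fetch("*", "https://www.example.com/index.html"))
    print(rp.can_fetch("*", "https://www.example.com/private/report.html"))

    A crawler that respects the Robots Exclusion Protocol performs essentially this check before requesting each URL.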

  7. #7
    Registered RH-Calvin's Avatar
    Join Date
    Mar 2017
    Location
    Forum
    Posts
    1,667
    The Disallow directive states that the listed pages are restricted from search engine crawling.

  8. #8
    It is an instruction telling search engine bots not to crawl a specific page of the website.
