Thread: Possible to block search engines...

  1. #1

Possible to block search engines from indexing certain pages of your site?
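For reference, the standard way to do this is a robots.txt file in your site root. A minimal sketch (the paths here are placeholders, not from this thread) that blocks specific directories for all crawlers:

```
User-agent: *
Disallow: /private/
Disallow: /drafts/
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not access control.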

  2. #2
I'm the oogie boogie man! — James (Joined Aug 2004, Canada, 1,566 posts)

  3. #3
    Thanks for the link!

  4. #4
Question... what does this imply? I think I more or less understand most of it except this:

    The robots.txt file should be created in Unix line-ending mode! Most good text editors will have a Unix mode, or your FTP client *should* do the conversion for you.

  5. #5
Working. — Masetek (Joined Aug 2005, Aust, 543 posts)
Unix and Windows have different ways of marking a new line. For example, in PHP, Unix/Linux understands \n for a newline but Windows needs \r\n (if I remember correctly!).

You don't need to worry about this when you're dealing with robots.txt; it's a programming issue. Your FTP client will sort it out.
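If you'd rather not rely on the editor or FTP client, you can generate the file yourself with explicit Unix newlines. A minimal sketch in Python (the Disallow paths are placeholder examples): opening the file in binary mode bypasses the platform's newline translation, so Windows won't silently turn \n into \r\n.

```python
# Write robots.txt with Unix ("\n") line endings regardless of OS.
# Binary mode ("wb") skips newline translation, so the bytes on disk
# are exactly what we join here -- no "\r\n" on Windows.
rules = [
    "User-agent: *",
    "Disallow: /private/",   # placeholder path, not from the thread
    "Disallow: /drafts/",    # placeholder path, not from the thread
]
with open("robots.txt", "wb") as f:
    f.write(("\n".join(rules) + "\n").encode("ascii"))
```

Opening in text mode ("w") would work too on Unix, but binary mode makes the result identical on every platform.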


  6. #6
    Could you also use this or is it not a very reliable way to block pages from being crawled?

    <META NAME="ROBOTS" CONTENT="NOINDEX">
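For what it's worth, that tag goes in the <head> of each page you want kept out of the index. A minimal hypothetical page (note that crawlers must be able to fetch the page to see the tag, so don't also block the same URL in robots.txt):

```
<html>
  <head>
    <!-- asks compliant crawlers not to index this page -->
    <meta name="robots" content="noindex">
    <title>Private page</title>
  </head>
  <body>...</body>
</html>
```

Like robots.txt, it's honored by the major engines but is not a security mechanism.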

