Possible to block search engines...



deronsizemore
10-02-2005, 12:05 PM
Is it possible to block search engines from indexing certain pages of your site?

James
10-02-2005, 02:40 PM
Yes, use robots.txt

http://searchengineworld.com/robots/robots_tutorial.htm
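
For example, a minimal robots.txt that tells all crawlers to stay out of two directories looks like this (the /private/ and /drafts/ paths are just placeholders for whatever you want hidden):

User-agent: *
Disallow: /private/
Disallow: /drafts/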

deronsizemore
10-02-2005, 02:58 PM
Thanks for the link!

deronsizemore
10-02-2005, 03:00 PM
Question... what does this mean? I think I understand most of it, except for this part:


The robots.txt file should be created in Unix line-ending mode! Most good text editors will have a Unix mode, or your FTP client *should* do the conversion for you.

Masetek
10-02-2005, 06:21 PM
Unix and Windows use different characters to mark the end of a line. For example, in PHP, Unix/Linux understands \n as a newline, but Windows needs \r\n (if I remember correctly!).

You don't need to worry about this when you're dealing with robots.txt; it's a programming issue. Your FTP client will sort it out.
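
If you ever do generate the file from a script, here's a minimal PHP sketch (the /private/ path is just a placeholder) that writes robots.txt with explicit Unix \n line endings, so the line-ending question never comes up:

<?php
// Build the robots.txt directives; /private/ is a placeholder path.
$lines = array(
    'User-agent: *',
    'Disallow: /private/',
);
// Join with "\n" (LF, the Unix convention) rather than "\r\n" (CRLF, the Windows one).
file_put_contents('robots.txt', implode("\n", $lines) . "\n");
?>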

:cool:

deronsizemore
10-10-2005, 05:56 PM
Could you also use this, or is it not a very reliable way to block pages from being crawled?

<META NAME="ROBOTS" CONTENT="NOINDEX">
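
That is, putting something like this in the <head> of each page you want kept out of the index:

<head>
<META NAME="ROBOTS" CONTENT="NOINDEX">
</head>

One caveat I've read: unlike robots.txt, the crawler still has to fetch the page before it can see the tag.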