Too many Googlebots



HiddenVorlon
06-06-2004, 08:01 PM
I have many sites on my server, with hundreds of thousands of pages of content that are all dynamically driven. I've optimized my sites pretty well, and clients generate new content on a daily basis. Day-to-day operation is fine, but when an army of Googlebots hits several of the sites at once, each with hundreds of pages of content, it brings my database server to a crawl.

Is there anything that you can do to throttle the Googlebots? Something in robots.txt?
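The only throttling directive I've seen mentioned is Crawl-delay, something like:

    User-agent: Googlebot
    Crawl-delay: 10

but from what I can tell that's non-standard, and Googlebot doesn't honor it; only some other crawlers do.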

otherground
06-07-2004, 12:04 AM
Could you possibly cache your dynamic content to help reduce the amount of work the database has to do?
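Even something crude, like writing each rendered page to disk and serving that copy until it expires, would keep most bot requests away from the database. A rough sketch of the idea in Python (the names and timeout are just placeholders; the same pattern works in whatever your sites actually run on):

    import os, time, hashlib

    CACHE_DIR = "cache"   # rendered pages get written here
    CACHE_TTL = 600       # serve a cached copy for up to 10 minutes

    def cache_path(url):
        # one file per URL, hashed so any URL makes a safe filename
        return os.path.join(CACHE_DIR, hashlib.md5(url.encode()).hexdigest() + ".html")

    def get_page(url, render):
        # return the page for url, calling render(url) -- the expensive
        # database-driven build -- only when no fresh cached copy exists
        os.makedirs(CACHE_DIR, exist_ok=True)
        path = cache_path(url)
        if os.path.exists(path) and time.time() - os.path.getmtime(path) < CACHE_TTL:
            with open(path, encoding="utf-8") as f:
                return f.read()   # cache hit: no database work at all
        html = render(url)        # cache miss: build the page once
        with open(path, "w", encoding="utf-8") as f:
            f.write(html)
        return html

When a hundred bot requests land on the same page inside the timeout window, only the first one touches the database.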

Peter T Davis
06-07-2004, 09:25 AM
Are you using the same script on all of your sites? Maybe there are some problems in the script that need fixing.

HiddenVorlon
06-07-2004, 06:41 PM
I'm looking at the caching now... that might help. The sites are very dynamic: navigation, content, footer, etc. are all database-driven. Under normal operation this runs just fine; it's the sheer volume of Googlebots hitting thousands of pages at once that brings the system to a crawl.

otherground
06-07-2004, 06:58 PM
Alternatively, you could just block Googlebot... but then your sites won't be indexed.
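e.g. this in robots.txt shuts Googlebot out of everything:

    User-agent: Googlebot
    Disallow: /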

paul
06-07-2004, 07:10 PM
I saw this post on SitePoint that might apply to your situation.

http://www.sitepoint.com/forums/showpost.php?p=1145730&postcount=16

HiddenVorlon
06-07-2004, 08:01 PM
That post looks like it could help. I'm running IIS, but it should have similar settings...

Percept
06-08-2004, 02:31 AM
Alternatively, you could just block Googlebot... but then your sites won't be indexed.

That's not something I would recommend ;)