I run many sites on my server, each with hundreds of thousands of dynamically generated pages. The sites are fairly well optimized and clients add new content daily. Day-to-day operations are fine, but when an army of Googlebots hits several of these large sites at the same time, it brings my database server to a crawl.
Is there anything you can do to throttle Googlebot? Something in robots.txt?