Should You Block Google from Crawling Internal Search Result Pages?
The Google Webmaster Guidelines still state that you should "Use the robots.txt file on your web server to manage your crawling budget by preventing crawling of infinite spaces such as search result pages." However, blocking internal search pages in your robots.txt doesn't seem like the right solution.
Why would you want to block Google from crawling internal search result pages?
Yes, you can block Google's bots from crawling the subpages of your website using robots.txt.
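As a minimal sketch, a robots.txt rule for this might look like the following. It assumes your internal search results live under a `/search/` path or use an `?s=` query parameter, which are common conventions but not universal; adjust the paths to match your own site's URL structure.

```txt
# Block all crawlers from internal search result pages
User-agent: *
Disallow: /search/
Disallow: /*?s=

# To target only Google's crawler instead, use:
# User-agent: Googlebot
# Disallow: /search/
```

Note that `Disallow` prevents crawling, not indexing: URLs blocked this way can still appear in search results (without a snippet) if other pages link to them. If the goal is to keep these pages out of the index entirely, a `noindex` robots meta tag on the pages themselves is the more reliable mechanism, and it only works if the pages are left crawlable.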