What about running the proxy off a directory instead of a file, then? I've found it easier to ban based on a directory than a file, and you might even be able to use some .htaccess mod_rewrite rules to reinforce the robots.txt.
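Something like this in .htaccess might work — a rough sketch, assuming the proxy lives under a directory I'm calling /proxy/ here, and with placeholder bot names you'd swap for the real user agents:

```
# Hypothetical example: block listed user agents from the proxy directory,
# reinforcing the robots.txt Disallow with a hard 403.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (BadBot|ScraperBot) [NC]
RewriteRule ^proxy/ - [F,L]
```

That way even crawlers that ignore robots.txt get refused at the directory level.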
Google re-fetched robots.txt for one of my sites today and is now blocking what I need... they must load a cached copy of it, then re-verify later. Or at least it sure seems that's what they did.
Now waiting for the others to catch up.