No, I just sent an email and fax to their host's legal dept (it was Rackspace).
Just checked now, and the index page of the copy of my site now redirects to the homepage: http://realestate.bizhat.com/guide/index.php
Originally Posted by ASP-Hosting.ca
Will check email later to see if they said anything...
Are you going to deal with all of these? It can be really time-consuming to do this on a regular basis...
Originally Posted by Chris
You know, just Googling around, it's amazing how many of my articles have been copied....

I don't think it's very amazing. Your sites are mostly pretty high-profile, easily found in search engines, and the infringers are probably just searching for the same things your visitors are, then copying. Now, what'd be amazing is someone copying entries from my blog. (Note: it syndicates Great Big Blog, so the copyright Google thing would probably show that in the results.) Though in the past I have seen some small sites that have had their content stolen. Not even very high-quality content, either.
I think there is some rule about enforcing your copyright or losing it, or maybe that's patents...
It's trademarks that you need to enforce.
That's the reason companies are suing any website that might have their trademark in its domain name, even if the site poses no threat at all to them. If at some point in the future they want to get rid of a site that really is threatening their brand, but there are a bunch of other sites using their name in their domains, they will most likely lose the case because they haven't been enforcing their trademark.
Make more money - Read my Web Publishing Blog
the content thieves = the scum of the net
How would you go about doing this? I've been looking for a way to prevent my site from being snaked by offline browsers/site grabbers.
Originally Posted by Chris
I do this:
The commented-out lines are where I made an autoban, so that if someone used a ripper, then turned it off or tried to change their user agent, they'd still be banned via .htaccess.

PHP Code:
$agent = strtolower(isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '');

// Substrings that identify common rippers/offline browsers. Note that
// very short entries like "get" and "strip" will also match innocent
// agents. (The original list's "Wg" and "Wget/" could never match the
// lowercased string; "wget" covers them, and the duplicate "webdup"
// is dropped.)
$ripper_patterns = array(
    "rip", "get", "icab", "wget", "lwp-request", "ninja", "reap",
    "subtract", "offline", "xaldon", "ecatch", "msiecrawler",
    "rocketwriter", "httrack", "track", "teleport", "webzip",
    "extractor", "lepor", "copier", "disco", "capture", "anarch",
    "snagger", "downloader", "superbot", "strip", "block", "saver",
    "webdup", "webhook", "pavuk", "interarchy", "blackwidow",
    "w3mir", "plucker", "naver", "cherry"
);

foreach ($ripper_patterns as $pattern) {
    if (strstr($agent, $pattern)) {
        // Autoban (commented out): append the visitor's IP to .htaccess
        // so the ban sticks even if they later change their user agent.
        // $LogFile = '/home/aspen0/public_html/.htaccess';
        // $logline = "deny from " . $_SERVER['REMOTE_ADDR'] . "\n";
        // $file = fopen($LogFile, "a");
        // flock($file, LOCK_EX);
        // fwrite($file, $logline);
        // flock($file, LOCK_UN);
        // fclose($file);
        header("Location: http://www.online-literature.com/banned/banned.php");
        exit;
    }
}
My .htaccess file got huge though, and many people on shared IPs (like AOL users) were banned.
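For reference, the lines the autoban appends use Apache's older (2.2-style) access-control syntax; on Apache 2.4 that syntax only works via mod_access_compat, and the native equivalent is a Require directive. A sketch (the IP is illustrative):

```apache
# What the autoban appends (Apache 2.2 syntax):
deny from 203.0.113.7

# Apache 2.4 native equivalent:
<RequireAll>
    Require all granted
    Require not ip 203.0.113.7
</RequireAll>
```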
Chris, how does one go about implementing this script? Just Include it? That did not work very well for me.
oh, okay that makes sense. thanks.
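To expand on implementing it: a minimal sketch, assuming each page is a PHP script. The filename and the shortened pattern list here are illustrative (Chris's full list is in the post above); the idea is to keep the test in a function in its own file and include it at the top of every page, before any output is sent, since header() fails once output has started.

```php
<?php
// blocker.php -- hypothetical filename; trimmed pattern list for illustration.

function is_ripper($agent) {
    // All patterns lowercase, since we lowercase the agent before matching.
    $patterns = array('wget', 'httrack', 'teleport', 'webzip', 'offline');
    $agent = strtolower($agent);
    foreach ($patterns as $p) {
        if (strpos($agent, $p) !== false) {
            return true;
        }
    }
    return false;
}

// Then at the top of every page, before any HTML is output:
//
//   include 'blocker.php';
//   if (is_ripper(isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '')) {
//       header('Location: http://www.online-literature.com/banned/banned.php');
//       exit;
//   }
```

Putting the check in one included file means the pattern list only has to be maintained in one place.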