I do find that some bots (particularly bad bots) try to sleuth out URLs from JavaScript and visit those URLs. For best SEO practice, you should not rely on search-engine bots seeing or following any JavaScript URLs. At the same time, it might be a good idea to structure tracking URLs like the one above so that they can be added to a deny list in your robots.txt file. For example, you might do:
Code:
<a href="http://www.photoshopguides.com" onfocus="this.href='track.php?page=/index'">Photoshop Guides</a>
<a href="/contact.html" onfocus="this.href='track.php?page=/contact'">Contact us</a>
Then add the following to your robots.txt file:
Code:
User-agent: *
Disallow: /track.php
On the back end (in PHP or whatever) you could append the ".html" if you so desired. Renaming "URL=" to "page=" and removing the domain name and ".html" will stop some bots from trying to sleuth the URL out of the JavaScript. The robots.txt entry will stop legitimate bots (e.g. Googlebot) from trying to index your tracking script page.
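As a minimal sketch of that back-end step (written in Python rather than PHP, purely for illustration; the `page=` parameter and the photoshopguides.com domain are taken from the example links above, and the helper name is made up), the tracking script might rebuild the real URL like this:

```python
from urllib.parse import parse_qs

# Assumed site domain, matching the example links above
SITE = "http://www.photoshopguides.com"

def resolve_tracked_page(query_string):
    """Rebuild the real URL from a track.php-style query string.

    Re-appends the domain and the ".html" suffix that were stripped
    from the tracking links to keep bots from sleuthing them out.
    """
    params = parse_qs(query_string)
    page = params.get("page", ["/index"])[0]
    if page == "/index":
        # "/index" stands in for the site root
        return SITE + "/"
    return SITE + page + ".html"
```

From there the script would log the resolved URL (or hand it to your stats package) and send the visitor on with a redirect.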