I'm being visited once a day or so by googlebot. Thing is, it only crawls the entrance page and has never been beyond that.
The site's fairly new, but it does have a temporary listing, and given how many times googlebot has visited, I would have expected a deep crawl by now.
I'm not using any robots.txt file and my links are standard like:
http://www.mydomain.com/linkdir/
And I'm not using any meta tags.
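For what it's worth, a quick sanity check like the sketch below (plain Python, standard library only; the domain is just the placeholder from above) can confirm that nothing is accidentally blocking Googlebot, either via a robots.txt rule or a robots meta tag on the entrance page.

```python
# Quick crawl-blocking sanity check. SITE is the placeholder domain from the
# post above; swap in the real one.
from urllib.request import urlopen
from urllib.robotparser import RobotFileParser

SITE = "http://www.mydomain.com"

# 1. robots.txt: with no robots.txt at all the server returns a 404 and
#    crawlers treat every path as allowed, so this should print True.
rp = RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()
print("Googlebot may fetch /linkdir/:",
      rp.can_fetch("Googlebot", SITE + "/linkdir/"))

# 2. Robots meta tag: a <meta name="robots" content="noindex,nofollow"> on
#    the entrance page would stop Googlebot from following links any deeper.
html = urlopen(SITE + "/").read().decode("utf-8", errors="replace").lower()
if 'name="robots"' in html and "nofollow" in html:
    print("Entrance page carries a nofollow robots meta tag.")
else:
    print("No blocking robots meta tag found on the entrance page.")
```

Both checks use only the standard library, so they run anywhere Python is installed.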
I really want the site deep crawled before the next update. Is there anything I can do to encourage that? Or are there reasons other than robots.txt that would prevent a deep crawl?