I'm interested in how much data you attribute to Google that makes you feel this is warranted. Are you finding it to be quite large, or do you have a small data plan for your server? In any case, there is some quite helpful information on Google's webmaster guidelines page, which also points you to the page Chris refers to:

http://www.google.com/support/webmasters/bin/answer.py?answer=35769
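As a sketch of what those guidelines describe, a robots.txt file at the document root can ask well-behaved crawlers to skip whole directories. The /ben/ path below is just an example taken from the log line in Don's message:

```
# robots.txt -- must live at the web server's document root
# Ask all crawlers to skip the image gallery (example path from the access log)
User-agent: *
Disallow: /ben/

# Crawl-delay is honoured by some crawlers (e.g. Yahoo, MSN) but is
# ignored by Googlebot; Google's crawl rate is set in Webmaster Tools.
Crawl-delay: 30
```

This only reduces crawling by bots that obey the convention; it is a request, not an enforcement mechanism.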

and their tools:

https://www.google.com/webmasters/tools/dashboard?pli=1

And, though not what you asked about, there is also the sitemap XML file, which among other things can tell a search engine how often your content changes, and so presumably reduce how often crawling eats into your data cap:

http://www.sitemaps.org/protocol.php
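For reference, a minimal sitemap using the changefreq hint might look like the following; the URL is a placeholder built from the path in Don's log line:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/ben/set002/pages/IMG_4491.html</loc>
    <lastmod>2008-10-08</lastmod>
    <!-- changefreq is a hint, not a command; crawlers may ignore it -->
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```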

Cheers,
Roger


[EMAIL PROTECTED] wrote:
66.249.71.108 - - [08/Oct/2008:12:50:49 +1300] "GET /ben/set002/pages/IMG_4491.html HTTP/1.1" 200 1515 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

Google and other search engines are crawling my home web server and now using up my data cap a bit fast.

What's the best way to make Apache look at the referrer (?) and send it packing?

Cheers Don
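(Not part of the original thread, but a sketch of the Apache side of Don's question: the string to match in that log line is the User-Agent header rather than the Referer, and an Apache 2.2-style block might look like this. The directory path is hypothetical. Note that denying Googlebot outright will eventually drop the site from Google's index, so robots.txt or the crawl-rate setting in Webmaster Tools is usually the gentler fix.)

```
# httpd.conf or .htaccess -- match the User-Agent, not the Referer
# Tag requests from crawlers we want to turn away
SetEnvIfNoCase User-Agent "Googlebot" deny_bot
SetEnvIfNoCase User-Agent "Slurp"     deny_bot

<Directory "/var/www/ben">
    Order Allow,Deny
    Allow from all
    # Tagged requests get a 403 Forbidden
    Deny from env=deny_bot
</Directory>
```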
