I had the same problem a few years ago when I started the first zip
code map. Before the first month was up, there were over 1000 different
robots attacking my server, trying to download the data I had there. I
put an end to this behavior by adding a "hit" counter to the xml
server program, which would stop any IP that exceeded a set number of
hits. I set the limit high enough that no human user would have a
problem but low enough that a robot would be blocked within
minutes. Overnight the attacks were thwarted. I've since instituted
even stronger measures to keep determined robots from taking data
from the servers, and even offered a subscription service to the data
for those who wanted to purchase "hits." This has been a good thing
for everyone, as the responsiveness of our servers and the quality of
our data remain high.

I suggest you institute a similar strategy on your server. Technically,
what I do is add or update a record in an "IP" table with the number
of hits and the time of the last hit. If the number of hits exceeds
the threshold, the program denies access to the data. Also, a
cron job deletes any record that hasn't been "hit" in a certain time,
thus resetting the counter for that IP. Those that hit the limit
just have to wait out the time period (two hours in this case) before
being allowed access again. The page itself tells the user if they've
exceeded the hit limit but doesn't stop them from using the rest of the
mapping stuff, so it doesn't violate any sort of terms for the Google
portion of the map.
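
For anyone who wants to try this, here's a rough sketch of the idea in
Python. It's not John's actual code (his runs inside the xml server
program against whatever database backs it); the table name "ip_hits",
the SQLite file, and the limit/window values are all placeholders you'd
tune for your own traffic. The two pieces mirror what's described
above: a per-request check that increments the counter, and a cleanup
routine you'd call from cron to reset idle IPs.

```python
import sqlite3
import time

# Placeholder values -- tune for your own traffic. John describes a
# two-hour reset window; the hit limit should be above anything a
# human user would generate but low enough to stop a robot quickly.
HIT_LIMIT = 500           # hits allowed before an IP is blocked
RESET_SECONDS = 2 * 3600  # idle records older than this get purged

def record_hit(conn, ip):
    """Add or update this IP's record; return True if access is still allowed."""
    now = int(time.time())
    row = conn.execute("SELECT hits FROM ip_hits WHERE ip = ?", (ip,)).fetchone()
    if row is None:
        hits = 1
        conn.execute(
            "INSERT INTO ip_hits (ip, hits, last_hit) VALUES (?, ?, ?)",
            (ip, hits, now),
        )
    else:
        hits = row[0] + 1
        conn.execute(
            "UPDATE ip_hits SET hits = ?, last_hit = ? WHERE ip = ?",
            (hits, now, ip),
        )
    conn.commit()
    return hits <= HIT_LIMIT

def purge_stale(conn):
    """Run periodically (e.g. from a cron job) to reset counters for idle IPs."""
    cutoff = int(time.time()) - RESET_SECONDS
    conn.execute("DELETE FROM ip_hits WHERE last_hit < ?", (cutoff,))
    conn.commit()

conn = sqlite3.connect("hits.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS ip_hits "
    "(ip TEXT PRIMARY KEY, hits INTEGER, last_hit INTEGER)"
)

# In the data handler: deny only the data request, not the rest of the map page.
if not record_hit(conn, "203.0.113.7"):
    print("Hit limit exceeded -- data access denied until the counter resets")
```

The key point is that only the data endpoint is gated; the map page
itself keeps working, so blocked users just see a notice and wait out
the reset window.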

-John Coryat

http://maps.huge.info

http://www.usnaviguide.com