A4D v3+

 

After reviewing my web logs, I can see that a range of IP addresses has been
crawling my customers' sites.

 

I can think of a way to detect them and prevent them from causing too much
activity by keeping arrays of IP addresses, request counters, and time-of-day
values: when SamePageRequests exceeds an HourlyCounterLimit (maybe 45), refuse
further responses to that IP address for one hour.
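Here is a minimal sketch (in Python, just to illustrate the logic) of the counter scheme described above. The names `HOURLY_LIMIT` and `BLOCK_SECONDS`, and the helper `allow_request`, are all illustrative, not part of any existing API; a real Active4D implementation would keep the same tables in interprocess arrays or a database.

```python
import time
from collections import defaultdict

HOURLY_LIMIT = 45        # illustrative threshold ("maybe 45")
BLOCK_SECONDS = 3600     # refuse responses for one hour
WINDOW_SECONDS = 3600    # counters reset hourly

request_counts = defaultdict(int)  # (ip, page) -> requests in current window
window_start = {}                  # (ip, page) -> start of counting window
blocked_until = {}                 # ip -> timestamp when the block expires

def allow_request(ip, page, now=None):
    """Return True if the request should be served, False if refused."""
    now = time.time() if now is None else now

    # Still inside a block? Refuse without counting.
    if ip in blocked_until:
        if now < blocked_until[ip]:
            return False
        del blocked_until[ip]      # block expired; start fresh

    key = (ip, page)
    # Start (or roll over) the hourly counting window for this IP/page pair.
    if key not in window_start or now - window_start[key] >= WINDOW_SECONDS:
        window_start[key] = now
        request_counts[key] = 0

    request_counts[key] += 1
    if request_counts[key] > HOURLY_LIMIT:
        blocked_until[ip] = now + BLOCK_SECONDS
        return False
    return True
```

One design point worth noting: counting per (IP, page) pair matches the "SamePageRequests" idea above, while the block itself applies to the whole IP, so a crawler hammering one page loses access to everything for the hour.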

 

Is there a better way to solve this problem besides using HTML meta tags and
robots.txt?

<META NAME="ROBOTS" CONTENT="NOINDEX"> 

<META NAME="ROBOTS" CONTENT="NOFOLLOW"> 

 

TIA!

 

David

_______________________________________________
Active4D-dev mailing list
[email protected]
http://mailman.aparajitaworld.com/mailman/listinfo/active4d-dev
Archives: http://mailman.aparajitaworld.com/archive/active4d-dev/
