Hello

I am sure this has been solved before.  I have a mailing list that is
about 30 years old, archived in MySQL and displayed through
Embperl.

Lately I have had a series of robots and crawlers, mostly from Google
Cloud and AWS IPs, that hammer the site with no regard for the network,
and I want to slow them down.  I can think of a few ways to do this,
including adding a timestamp variable to the page and checking it before
releasing content.  I'm not sure they would care if they got 50,000
pages of blank output.  I'm wondering how this has been previously
solved?
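The timestamp idea I have in mind would look roughly like this -- a minimal sketch in Python rather than Embperl, with illustrative names (`MIN_INTERVAL`, `should_serve`) that are my own, not from any library: remember when each client IP last hit the site, and only release real content if enough time has passed since its previous request.

```python
import time

# Assumed threshold between requests from one client; tune for real traffic.
MIN_INTERVAL = 2.0

# Maps client IP -> timestamp of its most recent request.
_last_seen = {}

def should_serve(ip, now=None):
    """Return True if this client waited at least MIN_INTERVAL seconds
    since its last request; otherwise the page can return blank output."""
    now = time.time() if now is None else now
    last = _last_seen.get(ip)
    _last_seen[ip] = now
    return last is None or (now - last) >= MIN_INTERVAL
```

In a real deployment the state would live somewhere shared (the MySQL database already backing the archive, or the web server's own rate-limiting modules) rather than in per-process memory.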


Reuvain

-- 
So many immigrant groups have swept through our town
that Brooklyn, like Atlantis, reaches mythological
proportions in the mind of the world - RI Safir 1998
http://www.mrbrklyn.com 

DRM is THEFT - We are the STAKEHOLDERS - RI Safir 2002
http://www.nylxs.com - Leadership Development in Free Software
http://www2.mrbrklyn.com/resources - Unpublished Archive 
http://www.coinhangout.com - coins!
http://www.brooklyn-living.com 

Being so tracked is for FARM ANIMALS and extermination camps, 
but incompatible with living as a free human being. -RI Safir 2013


---------------------------------------------------------------------
To unsubscribe, e-mail: embperl-unsubscr...@perl.apache.org
For additional commands, e-mail: embperl-h...@perl.apache.org