I would go with a global tally (in server memory) of the number of requests coming from a given client, and deny access to that client if they exceed a threshold of requests per time period.
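A minimal sketch of that tally, in Python for illustration (the list's context is CFML, so treat this as pseudocode for the logic; the 30-requests-per-60-seconds defaults are placeholders to tune):

```python
import time
from collections import defaultdict, deque

class RateLimiter:
    """In-memory tally: deny a client once it exceeds
    max_requests within the last window_seconds."""

    def __init__(self, max_requests=30, window_seconds=60):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        self.hits = defaultdict(deque)  # client_id -> request timestamps

    def allow(self, client_id, now=None):
        now = time.time() if now is None else now
        q = self.hits[client_id]
        # Drop timestamps that have aged out of the window.
        while q and now - q[0] > self.window_seconds:
            q.popleft()
        if len(q) >= self.max_requests:
            return False  # over the threshold: deny this request
        q.append(now)
        return True
```

Each request calls `allow()` with the client's identifier; a `False` return is where you would show the "you have been blocked" message.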
In order to identify a client you could concatenate some of the CGI variables and use the result (or a hash of the result) as a unique identifier for that client. If you pick the right variables you should be able to avoid banning multiple hosts that are going through a proxy and therefore share the same remote IP address.

The advantage of this approach is that you don't have to keep updating a global database of banned IPs. If you're running a clustered site you should be able to tune the sensitivity of the filter to ensure that even if they hit a different server on every request they still get blocked.

If a client goes over the threshold they would get a message telling them that they have been blocked, why, and how long they will have to wait before they can get into the site again.

Spike

Paul Vernon wrote:
> Hi All,
>
> We've just had one of our sites 'stolen' by someone in the Philippines
> using an application called iCollect. I got tracert logs etc. and will be
> following up on this.
>
> In the meantime, does anyone know of any IP / USER_AGENT filters that I
> can just drop into an application file so that it protects sites against
> this sort of thing?
>
> If not, I'm thinking about writing something that will do this so that I
> can protect all the sites that we host using CF, and I'd like opinions on
> how I would go about implementing it.
>
> My ideas on this up to now are these:
>
> 1. A global DB per server, with one table storing a list of banned user
> agents and another table storing the IP addresses of the machines using
> those user agents and the time they were first added.
>
> 2. A CF_ tag that is called in every application file to check the IP
> address, then the user agent, against the global DB and either do nothing
> (no match) or <cfabort>, thereby stopping the site from rendering for
> that user agent.
>
> 3. Some sort of rate monitor allowing only, say, 30 pages per minute; if
> this rate is hit then the IP gets banned automatically.
>
> Obviously, this is not foolproof and depends on the user agent actually
> telling the truth for part of it. The rate monitor would work every time
> for people attempting to download the entire site, though, so it's worth
> considering; they would still get something, just not everything, like
> what happened this morning.
>
> When it comes down to it, I know I can't stop them all. I just want to
> make life as difficult as possible for these people, because what I'm
> concerned about is people setting up fraudulent copies of e-commerce
> sites that we manage for our customers.
>
> Any help is much appreciated.
>
> Paul
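The CGI-variable fingerprinting idea could be sketched like this, again in Python for illustration (the particular variables to combine are an assumption and a tuning decision; the names used are standard CGI ones):

```python
import hashlib

def client_fingerprint(cgi_vars):
    """Build a client identifier by hashing a concatenation of CGI
    variables. Mixing in headers beyond REMOTE_ADDR (user agent,
    forwarded-for, accept-language) helps distinguish separate
    clients that reach you through the same proxy IP."""
    keys = ("REMOTE_ADDR", "HTTP_X_FORWARDED_FOR",
            "HTTP_USER_AGENT", "HTTP_ACCEPT_LANGUAGE")
    raw = "|".join(cgi_vars.get(k, "") for k in keys)
    return hashlib.sha1(raw.encode("utf-8")).hexdigest()
```

The fingerprint would be the `client_id` fed into whatever rate tally or ban table you keep; two clients behind one proxy get different identifiers as long as any of the combined headers differ.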

