On Oct 3, 9:50 am, [EMAIL PROTECTED] (Tom Phoenix) wrote:
> On 10/2/07, Tequila_Sunrise <[EMAIL PROTECTED]> wrote:
>
> > I want to place a timer on my post function so that users can only
> > post every 30 seconds to prevent flooding.
>
> Of course, this doesn't have anything to do with Perl. You'd use the
> same technique with any other programming language. (That's a sign
> that maybe a forum specifically for *Perl* beginners might not be the
> best place for the question.)
>
> > I thought about using cookies that expire every 30 seconds
>
> That's probably a bad idea, for many reasons. (For one, user clocks
> aren't likely to always be synchronized with yours to within 30
> seconds.) But it's fairly straightforward to prevent double-posts,
> without any cookies at all:
>
> When a request comes in to your program, compare it with a small
> database of recent requests. If it matches, and the time was within
> the past 30 seconds, quickly give the user a "server error".
>
> That's the idea in a nutshell. But there are some important details.
>
> The big problem is to efficiently compare this request with all
> recent ones. This is solved if you compute a digest hash of the
> request. (Despite the term "hash", this has nothing to do with Perl
> hashes.) A hash (digest) algorithm will combine all of the form items
> into a single, compact piece of data. You could do this with the MD5
> algorithm, or with any of the SHA- algorithms, such as SHA-1. There
> are modules on CPAN to use these algorithms in Perl.
>
> The input to the hash algorithm includes everything that you know
> will be the same in repeated requests: form values, remote IP
> address, probably nothing else. The output of the hash algorithm will
> be a chunk of random-looking data, in hex or some other format.
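[The digest step described above might be sketched like this. Digest::SHA has shipped with Perl since 5.9.3; the form fields and the IP address below are made-up examples — in a real CGI script they would come from your form parser and $ENV{REMOTE_ADDR}.]

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Digest::SHA qw(sha1_hex);   # core module; Digest::MD5 works the same way

# Example inputs (hypothetical; substitute your real form data and
# the client's remote IP address).
my %form = (
    name    => 'Tequila_Sunrise',
    message => 'Hello, world!',
);
my $remote_ip = '192.0.2.7';    # e.g. $ENV{REMOTE_ADDR}

# Join the inputs in a fixed order so that identical requests always
# produce identical digests.
my $input  = join "\0", $remote_ip, map { "$_=$form{$_}" } sort keys %form;
my $digest = sha1_hex($input);

print "$digest\n";   # 40 hex characters of random-looking data
```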
> You can save the last few hundred seconds' worth of requests in a
> flat file; each line in the file is a hash followed by a timestamp:
>
> 01234DEADBEEF04321 1192424693
> 98765CAFEBABE98765 1192424693
> 33333CAB0FEED33333 1192424695
>
> When a duplicate request comes in, it will have a matching hash
> value. If the timestamp is within the last 30 seconds, I'd just give
> the user a simple "server error" page. I might even include a snarky
> comment in the error text that says that "sometimes this is caused
> by impatient users who click submit more than once." But I'd be sure
> to say that they should use their browser's Back function to return
> to the form, wait a full minute, and submit it again. They don't
> really have to wait for a full minute, but they do have to wait for
> at least the rest of the 30 seconds since their first request.
>
> Of course, that assumes that they actually _should_ re-submit their
> request. If you mean to tell them that their request has already
> been submitted and doesn't need to be re-submitted, I'd return a
> page that says that. If you can, give a link on that page to the
> result of the original submission. (That is, if they were posting a
> comment, give them a link to the page with the posted comment. But
> beware concurrency issues; the process that's posting the comment
> may not have finished yet. It may be safer to give a different
> link.)
>
> When a non-duplicate request arrives, you'll need to add it to the
> file. Don't add it again when it is a duplicate, or it locks the
> user out for even longer.
>
> Whenever you're accessing the file, whether you're updating it or
> not, you'll need to lock it; see flock(). You'll need to
> periodically remove old entries; I'd do that relatively rarely, as
> the file will be exclusively locked for a relatively long time; say,
> only when the oldest (first) entry in the file is more than 300
> seconds old.
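[The duplicate check against the flat file might look like this. The file name and digest are made-up examples; $digest would come from the hashing step, and locking is omitted here for brevity:]

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $file   = 'recent_requests.txt';     # hypothetical file name
my $digest = '33333CAB0FEED33333';      # from the hashing step
my $now    = time();

# Scan the file of recent requests for a matching digest that is
# less than 30 seconds old.
my $is_duplicate = 0;
if (open my $fh, '<', $file) {
    while (my $line = <$fh>) {
        my ($seen_digest, $seen_time) = split ' ', $line;
        next unless defined $seen_time;
        if ($seen_digest eq $digest && $now - $seen_time < 30) {
            $is_duplicate = 1;
            last;
        }
    }
    close $fh;
}

if ($is_duplicate) {
    # Turn away the duplicate with a simple "server error" page.
    print "Status: 500 Server Error\n";
    print "Content-type: text/plain\n\n";
    print "Please wait 30 seconds between posts.\n";
}
```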
> As soon as you're done with the file, close it in order to quickly
> release the lock.
>
> If your form is very simple, many requests may be identical. That's
> one reason to include the remote IP address in the hash. But
> multiple users can share the same IP address, and you don't want to
> accidentally deny them access. If your form is very simple, add a
> hidden field with a small piece of randomly-generated data when the
> form is generated for each user, to ensure that each user has
> different form contents.
>
> Does that give you what you need? You may want to tweak some details
> for your needs. If you have trouble implementing any parts of this
> in Perl, please feel free to ask here for more help. Cheers!
>
> --Tom Phoenix
> Stonehenge Perl Training
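[The flock(), pruning, and append steps might be sketched like this. The file name and digest are made-up examples; the exclusive lock is taken for the whole read-prune-append sequence and released promptly by closing the file:]

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock O_RDWR O_CREAT);

my $file   = 'recent_requests.txt';     # hypothetical file name
my $digest = '01234DEADBEEF04321';      # from the hashing step
my $now    = time();

# Open (creating if necessary) and take an exclusive lock.
sysopen my $fh, $file, O_RDWR | O_CREAT or die "Can't open $file: $!";
flock $fh, LOCK_EX or die "Can't lock $file: $!";

my @lines = <$fh>;

# Prune only when the oldest (first) entry is more than 300 seconds
# old, so the file is rewritten relatively rarely.
if (@lines) {
    my (undef, $oldest) = split ' ', $lines[0];
    if (defined $oldest && $now - $oldest > 300) {
        @lines = grep { (split ' ')[1] > $now - 300 } @lines;
        seek $fh, 0, 0;
        truncate $fh, 0;
        print {$fh} @lines;
    }
}

# Record this non-duplicate request, then close the file right away
# to release the lock quickly.
seek $fh, 0, 2;
print {$fh} "$digest $now\n";
close $fh;
```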
I like the lock idea a lot! I can see where you're going with it. I was also researching timestamps using the epoch method. I can simply set a user's cookie to the current epoch time after they post. When they post again (or post for the first time), the script checks that either no epoch value is stored, or the stored epoch value is less than or equal to the current epoch minus 30. This way, at least on my system, it will remain accurately timed. As for the reasoning: my script has no limits on who can use it or how often, and I wanted to make sure that people don't flood my web server with stupid stuff.

-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
http://learn.perl.org/
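[The epoch comparison described in that reply might be sketched like this. The $last_post value is a made-up example; in a real script it would be read from the user's cookie, e.g. via CGI::Cookie:]

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $last_post = time() - 10;   # pretend the cookie says "10 seconds ago"
my $now       = time();

# Allow the post only if no timestamp is stored, or the stored value
# is at least 30 seconds in the past.
my $ok_to_post = (!defined $last_post || $last_post <= $now - 30) ? 1 : 0;

if ($ok_to_post) {
    print "OK to post; set the cookie to $now\n";
}
else {
    my $wait = $last_post + 30 - $now;
    print "Please wait $wait more second(s) before posting again.\n";
}
```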