On Tue, 2003-10-28 at 15:49, Justin Ruthenbeck wrote:

> Here's the situation (correct me if I'm wrong):
>    + User fills out a form and clicks submit
>    + The browser submits the form and sits in a wait state
>    + The server begins processing a request for a new record
>    + The user clicks submit once again
>    + The browser submits the form and sits in a wait state again
>    + The server begins processing a second (identical) request
> 
> This is classic double submission where two requests for the same thing 
> overlap on the server.  Your server thinks they are independent, but you 
> only want one to complete since it's actually a user error.

You understand the situation perfectly.

> And if you're concerned about this affecting your long term memory 
> performance, there are ways to mitigate the impact ... you almost surely 
> don't need this level of guarantee for every request.

I did the numbers and found that for our expected workload we couldn't
possibly use more than a couple of megs of memory keeping track of a
unique token for each form served. I'm dropping tokens after some time (I
haven't decided how long yet, but a couple of hours is probably long
enough), which should keep memory usage under control.

Thanks everyone for your input.

