Hi, yeah, that could be a solution if that were the case; I didn't explain the problem in enough depth.
When you perform a request, you provide certain arguments that are contained in the request itself, so it's not only a matter of having a background job that collects information and updates the database. User A requests some kind of action on a selected router and provides some arguments (e.g. ping an IP address supplied by the user).

So, what I need is to set some kind of limit on how many requests can be running concurrently. This could of course be enforced on the router itself by limiting the number of SSH sessions, but I just wanted to know if there was some 'rails way' of performing this check. I think I will go with a typical solution like shared memory or some kind of lock.

Best regards,
Rafael Fernández López.
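For what it's worth, the lock-based approach could be sketched as a plain Ruby counting semaphore built on Mutex and ConditionVariable. The class name RouterSemaphore and the limit value are my own illustration, and this only works within a single process; a multi-process Rails deployment would need a shared store (e.g. a database row or Redis key) instead.

```ruby
require "thread"

# Counting semaphore (hypothetical helper): caps how many requests may
# hold an SSH session against a router at the same time.
class RouterSemaphore
  def initialize(limit)
    @limit = limit
    @count = 0
    @mutex = Mutex.new
    @cond  = ConditionVariable.new
  end

  # Blocks until a slot is free, runs the block, then releases the slot.
  def with_slot
    @mutex.synchronize do
      # Wait while all slots are taken; wakes up when a slot is released.
      @cond.wait(@mutex) while @count >= @limit
      @count += 1
    end
    begin
      yield
    ensure
      @mutex.synchronize do
        @count -= 1
        @cond.signal # wake one waiter, if any
      end
    end
  end
end

# Usage sketch: allow at most 5 concurrent SSH sessions per router.
# SEMAPHORE = RouterSemaphore.new(5)
# SEMAPHORE.with_slot { run_ping(router, ip) }
```

A request that arrives when all slots are busy simply blocks in `with_slot` until another request finishes, which may or may not be acceptable depending on your response-time requirements.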

