@Debraj,

Until you can accurately characterize the request/response that users are
sending as input, it's hard to give good advice on this.  If they are
sending a bunch of data and just expecting an acknowledgement back that
you've received it and are processing it, then there are fairly simple ways
of dealing with it.  On the other hand, if they are sending you data and
you have to make all the DB calls, REST calls, and number crunching and then
send the results back, it's a bit trickier but not impossible.

Are you sure that it is OK to just bounce your client's request?  That
means they'll need retry and/or error handling on their side, which is
usually undesirable.  But if you do want to fail fast like that, then you
should do it on the incoming endpoint by limiting the max requests/threads
as shown in the Jetty documentation.  I'll assume for now that you are using
Jetty, as that's probably the most common.
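A sketch of what capping the consumer's thread pool could look like with
Camel's Jetty component (the port, path, and pool sizes below are just
examples I made up, not recommendations):

```java
import org.apache.camel.builder.RouteBuilder;

// Minimal sketch: cap the Jetty consumer's thread pool so excess
// concurrent requests back up (and eventually fail) at the HTTP layer
// instead of flooding the route.
public class IngestRoute extends RouteBuilder {
    @Override
    public void configure() {
        // minThreads/maxThreads are camel-jetty consumer options that
        // size the underlying Jetty thread pool.
        from("jetty:http://0.0.0.0:8080/ingest?minThreads=5&maxThreads=20")
            .to("direct:handle");

        from("direct:handle")
            .log("Handling ${body}");   // stand-in for the real processing
    }
}
```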

Jetty also has "continuations" that you may want to look into.
Essentially a continuation doesn't tie up the input thread: it permits you
to put the data on a queue for asynchronous processing (SEDA, or persistent
JMS, depending on the data) and then wait for the response from your
asynchronous processing before sending the response back to the user.
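Roughly, the continuation pattern could look like this in a Camel route
(option names are from the camel-jetty and camel-seda components; the
endpoint names and timeouts are invented for the example):

```java
import org.apache.camel.builder.RouteBuilder;

// Sketch of the continuation pattern: the Jetty thread is released while
// the SEDA route does the heavy lifting, and Jetty resumes the request
// when the reply is ready.
public class ContinuationRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("jetty:http://0.0.0.0:8080/report"
                + "?useContinuation=true&continuationTimeout=30000")
            // InOut over SEDA: the exchange waits for the reply from the
            // asynchronous consumer before the HTTP response is sent.
            .to("seda:crunch?waitForTaskToComplete=Always");

        from("seda:crunch?concurrentConsumers=10")
            .to("bean:numberCruncher");   // hypothetical processing bean
    }
}
```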

You need to characterize the problem better.  Think of a request to Amazon
for an order: there's a credit card to be charged, database lookups and
reservations for inventory, shipping information, etc.  You don't sit and
wait for Amazon to process all of that.  They send a quick response back
thanking you for your order, later send an email telling you your order
has been successfully processed, and later still send an email that your
order has been shipped.  I'm not saying that exact flow is right for your
business case, but the point is that they send a very fast response
essentially acknowledging they've received your order and are processing
it.
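That fast-acknowledgement pattern could be sketched in Camel like this
(endpoint and bean names are invented for illustration):

```java
import org.apache.camel.builder.RouteBuilder;

// Sketch: accept the order, hand it off with wireTap so the slow work
// runs on a separate thread, and return an acknowledgement immediately.
public class QuickAckRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("jetty:http://0.0.0.0:8080/orders")
            .wireTap("seda:processOrder")             // fire-and-forget copy
            .transform().constant("Order received");  // instant HTTP reply

        from("seda:processOrder")
            .to("bean:orderService?method=process");  // slow work happens here
    }
}
```

For anything you can't afford to lose if the JVM dies, the SEDA queue
would be swapped for a persistent JMS queue, as mentioned above.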

Brad

On Sun, Oct 2, 2016 at 11:32 PM, yogu13 <[email protected]> wrote:

> I think the throttler can be used in your case, unless you feel there
> couldn't be an issue when you use it.
>
> Another approach is to use a Route Policy
> <http://camel.apache.org/routepolicy.html>
>
> Regards,
> -Yogesh
>
>
>
> --
> View this message in context: http://camel.465427.n5.nabble.com/Limit-Concurrent-Access-tp5788278p5788304.html
> Sent from the Camel - Users mailing list archive at Nabble.com.
>
