I have been trying to figure out whether this is possible - I may be missing
the answer completely, or maybe it just isn't possible.

What I need to do is this:

I am trying to create a generic process that accepts SQL queries, which get
dropped into a trigger directory on a schedule via a cron entry.  The queries
take a (relatively) long time to execute, so if a second query is scheduled
to run while the first is still running, I want the second to run in a
different thread so the two can run simultaneously.
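To make that concrete, here is a simplified sketch of the kind of processor
involved.  The class name, the DataSource wiring, and treating the file body
as raw SQL text are all just placeholder assumptions, not actual code:

import java.sql.Connection;
import java.sql.Statement;
import javax.sql.DataSource;

import org.apache.camel.Exchange;
import org.apache.camel.Processor;

// Placeholder processor: treats the incoming file body as SQL text and
// runs it against a DataSource (the DataSource wiring is not shown here).
public class SqlFileProcessor implements Processor {
    private final DataSource dataSource;

    public SqlFileProcessor(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    public void process(Exchange exchange) throws Exception {
        // The file component converts the file content to a String for us.
        String sql = exchange.getIn().getBody(String.class);
        Connection con = dataSource.getConnection();
        try {
            Statement stmt = con.createStatement();
            try {
                stmt.execute(sql);   // potentially long-running
            } finally {
                stmt.close();
            }
        } finally {
            con.close();
        }
    }
}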

I noticed there is a "concurrentConsumers" property on a JMS endpoint.  Does
that do essentially what I describe, but only for JMS?  For example, could
you set concurrentConsumers = 4 and have a single application process 4 JMS
messages simultaneously (4 different messages if it's a Queue; the same
message 4 times if it is a Topic)?
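For comparison, my understanding of the JMS version is something like the
following.  This is just a minimal sketch; the "activemq" component name and
the queue name are placeholders I made up:

import org.apache.camel.Exchange;
import org.apache.camel.Processor;
import org.apache.camel.builder.RouteBuilder;

public class JmsConcurrentRoute extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        // With concurrentConsumers=4 the JMS consumer starts 4 threads,
        // so up to 4 messages from the queue are processed at once.
        from("activemq:queue:sqlQueries?concurrentConsumers=4")
            .process(new Processor() {
                public void process(Exchange exchange) throws Exception {
                    // long-running work goes here
                }
            });
    }
}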

This is basically the route that I am using for the process:

from("file://c:/triggerdirectory").process(someProcessor);

However, as I understand it, this will only handle one file at a time.  So if
a second file arrives, it will not get processed until the first one is
finished and the processor returns.

Reading the Competing Consumers documentation
(http://activemq.apache.org/camel/competing-consumers.html), I am wondering
if something like this might work:

from("file://c:/triggerdirectory").to("seda:filetrigger")
from("seda:filetrigger?concurrentConsumers=5").to(someProcessor);

Will that cause each triggered file to go to the SEDA queue, with the second
route doing the parallel processing (up to 5 files at a time) while the first
route goes back to listening to the directory?  This is all defined within a
single CamelContext and running application; a fuller sketch of what I mean
is below.
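Put together, what I have in mind is roughly this.  Again just a minimal
sketch; the RouteBuilder class name and the constructor are only there to
make it self-contained:

import org.apache.camel.Processor;
import org.apache.camel.builder.RouteBuilder;

public class FileTriggerRoutes extends RouteBuilder {
    private final Processor someProcessor;

    public FileTriggerRoutes(Processor someProcessor) {
        this.someProcessor = someProcessor;
    }

    @Override
    public void configure() throws Exception {
        // Route 1: pick up trigger files, hand them straight to the SEDA
        // queue, then go back to polling the directory.
        from("file://c:/triggerdirectory").to("seda:filetrigger");

        // Route 2: up to 5 threads drain the SEDA queue, so up to 5 files
        // are processed simultaneously.
        from("seda:filetrigger?concurrentConsumers=5").process(someProcessor);
    }
}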

Any ideas?  Thanks in advance!