Hello there!

We are planning to use Hadoop for our distributed system; however, it is
still unclear to us whether the following tasks can be solved with it:

1) maintain a fixed number of jobs executing concurrently at any given time

2) grant all jobs access to the same static resource, which they should
share. This resource is essentially a FIFO queue: it is filled by a third-party
application, and can also be filled by certain jobs when they meet some
conditions. So a job can put data into this queue, and that data then needs to
be picked up either by the same job on its next iteration or by some other job

3) schedule new jobs when some of the previously scheduled jobs complete, or
allow the jobs to check for new data in the FIFO queue described above (a
small sketch of the overall behavior we have in mind follows below)
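
To make the intent concrete, here is a minimal plain-Java sketch of the
behavior we are after. This is not Hadoop code; the class name, queue items,
and pool size are made up for illustration. A fixed-size thread pool caps the
number of concurrent "jobs", a shared BlockingQueue stands in for the FIFO,
jobs may push follow-up items back onto it, and new jobs are submitted as
slots free up:

import java.util.concurrent.*;

public class QueueDrivenScheduler {
    // Shared FIFO queue: filled by a third-party producer and, under some
    // conditions, by the jobs themselves (requirement 2).
    private static final BlockingQueue<String> fifo = new LinkedBlockingQueue<>();

    // Cap on the number of jobs executing at the same time (requirement 1).
    private static final ExecutorService pool = Executors.newFixedThreadPool(4);

    public static void main(String[] args) throws InterruptedException {
        // Simulate the third-party application seeding the queue.
        for (int i = 0; i < 10; i++) fifo.put("task-" + i);

        // Scheduler loop: whenever a slot is free, pick the next queue item
        // and submit a new job (requirement 3).
        CompletionService<Void> done = new ExecutorCompletionService<>(pool);
        int inFlight = 0;
        while (true) {
            String item = fifo.poll(1, TimeUnit.SECONDS);
            if (item == null && inFlight == 0) break; // queue drained, all done
            if (item != null) {
                done.submit(() -> runJob(item), null);
                inFlight++;
            }
            // Reap any jobs that have completed in the meantime.
            while (done.poll() != null) inFlight--;
        }
        pool.shutdown();
    }

    private static void runJob(String item) {
        System.out.println("processing " + item);
        // A job may push derived work back onto the shared queue, to be
        // picked up later by itself or by another job (requirement 2).
        if (item.endsWith("-3")) fifo.offer(item + "-followup");
    }
}

Essentially we would like to know whether an equivalent setup is possible when
the "jobs" are Hadoop jobs running on a cluster rather than threads in a
single JVM.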

Thank you in advance!

-- 
Eugene N Dzhurinsky
