Hi,

once you have started the JobTracker and NameNode on your cluster, you can
launch a job from any node of the cluster.

AFAIK, to submit multiple jobs you need to handle that yourself, either:
 - by writing a bash script that launches the job jars one after the other, or
 - by bundling several jobs in a single job.jar and calling
JobClient.runJob( job ) repeatedly, once per job; for this you build a
fresh JobConf for each job (see the sketch below).
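
Here is a minimal sketch of the second option, assuming the classic
org.apache.hadoop.mapred API of that era. It chains two trivial identity
jobs: each gets its own JobConf, and the static JobClient.runJob() blocks
until that job completes, so the second job only reads the intermediate
output once it is fully written. The paths, job names, and the use of the
identity mapper/reducer are placeholders, not anyone's actual jobs.

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.lib.IdentityMapper;
import org.apache.hadoop.mapred.lib.IdentityReducer;

public class JobChain {
  public static void main(String[] args) throws Exception {
    // First job: pass the raw input through to an intermediate directory.
    JobConf first = new JobConf(JobChain.class);
    first.setJobName("first-pass");
    first.setMapperClass(IdentityMapper.class);
    first.setReducerClass(IdentityReducer.class);
    first.setOutputKeyClass(LongWritable.class);   // TextInputFormat keys
    first.setOutputValueClass(Text.class);         // TextInputFormat values
    FileInputFormat.setInputPaths(first, new Path(args[0]));
    FileOutputFormat.setOutputPath(first, new Path("intermediate"));
    JobClient.runJob(first);  // blocks until the first job finishes

    // Second job: consume the first job's output; a separate JobConf.
    JobConf second = new JobConf(JobChain.class);
    second.setJobName("second-pass");
    second.setMapperClass(IdentityMapper.class);
    second.setReducerClass(IdentityReducer.class);
    second.setOutputKeyClass(LongWritable.class);
    second.setOutputValueClass(Text.class);
    FileInputFormat.setInputPaths(second, new Path("intermediate"));
    FileOutputFormat.setOutputPath(second, new Path(args[1]));
    JobClient.runJob(second);
  }
}

You would bundle this driver in your job.jar and submit it once; the
chaining then happens inside the single driver rather than in a script.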

On 12/01/2008, Natarajan, Senthil <[EMAIL PROTECTED]> wrote:
>
> Hi,
> I have some basic questions about Hadoop job submission. Could you please
> let me know:
>
>
> 1)      Once the Hadoop daemons (DFS, JobTracker, etc.) are started by the
> hadoop user:
>
> 2)      Can any user submit a job to Hadoop?
>
> 3)      Or does each user have to start the Hadoop daemons and submit the
> job?
>
> 4)      Is there any queue available, like Condor, for submitting multiple
> jobs?
>
> Thanks,
> Senthil
>
