You can chain job submissions at the client, and you can also run more than one job in parallel (provided you have enough task slots). An example of chaining jobs is in src/examples/org/apache/hadoop/examples/Grep.java, where the grep-search and grep-sort jobs are chained.
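For illustration, here is a minimal sketch of that pattern (not the actual Grep.java source), assuming the old org.apache.hadoop.mapred API. The ChainedJobs class name, the identity mapper/reducer placeholders and the argument paths are just stand-ins for the example. JobClient.runJob() blocks until the job finishes, so the second job is only submitted once the first completes; if you want jobs to run in parallel instead, JobClient.submitJob() returns without waiting.

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.lib.IdentityMapper;
import org.apache.hadoop.mapred.lib.IdentityReducer;

public class ChainedJobs {
  public static void main(String[] args) throws Exception {
    Path input = new Path(args[0]);
    Path intermediate = new Path(args[1]);   // output of job 1, input of job 2
    Path output = new Path(args[2]);

    // First job (analogous to "grep-search"): submitted and run to completion.
    JobConf first = new JobConf(ChainedJobs.class);
    first.setJobName("first-pass");
    first.setMapperClass(IdentityMapper.class);     // placeholder mapper
    first.setReducerClass(IdentityReducer.class);   // placeholder reducer
    FileInputFormat.setInputPaths(first, input);
    FileOutputFormat.setOutputPath(first, intermediate);
    JobClient.runJob(first);   // blocks until job 1 finishes

    // Second job (analogous to "grep-sort"): consumes the first job's output.
    JobConf second = new JobConf(ChainedJobs.class);
    second.setJobName("second-pass");
    second.setMapperClass(IdentityMapper.class);    // placeholder mapper
    second.setReducerClass(IdentityReducer.class);  // placeholder reducer
    FileInputFormat.setInputPaths(second, intermediate);
    FileOutputFormat.setOutputPath(second, output);
    JobClient.runJob(second);
  }
}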
On 1/18/09 9:58 AM, "Aditya Desai" <aditya3...@gmail.com> wrote:

> Is it possible to call a MapReduce job from inside another? If yes, how?
> And is it possible to disable the reducer completely, i.e. finish the job
> immediately after the map phase has terminated?
> I have tried -reducer "NONE". I am using the streaming API to code in Python.
>
> Regards,
> Aditya Desai