Hi Arun,

I can see that the output committer is present in the reducer.
How do I make sure that this committer runs at the end of the job, or does
it run by default at the end of the job?
I can have more than one reducer task.
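If it helps, this is roughly what I am trying: overriding commitJob() on a committer so some custom logic runs after the job. This is an untested sketch against the old mapred API from the docs you linked; MyCommitter is just a placeholder name.

```java
import java.io.IOException;

import org.apache.hadoop.mapred.FileOutputCommitter;
import org.apache.hadoop.mapred.JobContext;

// Placeholder committer: hooks custom logic into the end of the job.
public class MyCommitter extends FileOutputCommitter {
    @Override
    public void commitJob(JobContext context) throws IOException {
        // Let the default committer promote task outputs first.
        super.commitJob(context);
        // Custom end-of-job work would go here.
    }
}
```

I would then wire it in with conf.setOutputCommitter(MyCommitter.class) on the JobConf, if I understand the API correctly.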




On Sun, Apr 29, 2012 at 11:28 PM, Arun C Murthy <a...@hortonworks.com> wrote:

> Use OutputCommitter.(abortJob, commitJob):
>
> http://hadoop.apache.org/common/docs/r1.0.2/api/org/apache/hadoop/mapred/OutputCommitter.html
>
> Arun
>
> On Apr 26, 2012, at 4:44 PM, kasi subrahmanyam wrote:
>
> Hi
>
> I have a few jobs added to a JobControl.
> I need an afterJob() to be executed after the completion of a Job.
> For example:
>
> Here I am actually overriding the Job of JobControl.
> I have Job2 depending on the output of Job1. The input for Job2 is obtained
> after doing some file system operations on the output of Job1. This
> operation should happen in an afterJob() method which is available for each
> Job. How do I make sure that the afterJob() method is called for each Job
> added to the controller before running the jobs that depend on it?
>
>
> Thanks
>
>
> --
> Arun C. Murthy
> Hortonworks Inc.
> http://hortonworks.com/
>
>
>
