Hello,

After adding a few print statements, I've found that the "while (!jobControl.allFinished())" loop does in fact terminate, so there must be a non-daemon thread somewhere that is preventing the JVM from exiting. Am I missing a configuration option that controls this? Any ideas where that thread might be?
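To illustrate what I mean by a non-daemon thread holding the JVM open, here's a minimal plain-Java sketch (no Hadoop involved; the class and method names are made up for the example). A running non-daemon thread keeps the JVM alive even after main() returns, while a daemon thread does not:

```java
public class DaemonDemo {
    // Starts a background polling thread. With daemon=true the thread will
    // not keep the JVM alive once main() returns; with daemon=false it will.
    static Thread startWorker(boolean daemon) {
        Thread worker = new Thread(() -> {
            while (true) {
                try {
                    Thread.sleep(1000); // simulates an endless polling loop
                } catch (InterruptedException e) {
                    return; // exit if interrupted
                }
            }
        });
        worker.setDaemon(daemon); // must be set before start()
        worker.start();
        return worker;
    }

    public static void main(String[] args) {
        Thread worker = startWorker(true); // with false, the JVM would never exit
        System.out.println("worker.isDaemon() = " + worker.isDaemon());
        // main() returns here; the JVM exits because the only other
        // live thread is a daemon.
    }
}
```

So if some library thread is still running after my loop finishes, that would explain the hang.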
Thanks!

- Mike

On Sun, Aug 19, 2012 at 9:46 AM, Michael Parker <[email protected]> wrote:
> Hi all,
>
> I have a two-stage MR, written using entirely the new API, that I'm
> running using Hadoop 1.0.3. In the end it generates a file named
> part-r-0000 with the correct output, but the main method doesn't
> terminate, despite me using the allFinished method of JobControl as I
> thought I was supposed to. Any help would be greatly appreciated. A
> snippet from my code follows.
>
> >>>
>
> import org.apache.hadoop.mapreduce.Job;
> import org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob;
> import org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl;
>
> ...
>
> public static void main(String[] args) throws IOException {
>     ...
>
>     Job job1 = new Job(new Configuration(conf), "job1");
>     ...
>     ControlledJob controlledJob1 = new ControlledJob(job1,
>         Collections.<ControlledJob>emptyList());
>
>     Job job2 = new Job(new Configuration(conf), "job2");
>     ...
>     ControlledJob controlledJob2 = new ControlledJob(job2,
>         Arrays.asList(controlledJob1));
>
>     JobControl jobControl = new JobControl("main");
>     jobControl.addJobCollection(Arrays.asList(controlledJob1, controlledJob2));
>     Thread t = new Thread(jobControl);
>     t.start();
>
>     while (!jobControl.allFinished()) {
>         try {
>             Thread.sleep(1000);
>         } catch (InterruptedException e) {
>             // Ignore.
>         }
>     }
> }
>
> <<<
>
> Thanks!
>
> - Mike
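In case it helps anyone reproduce this: my working theory (an assumption on my part, not verified against the Hadoop source) is that the thread running JobControl keeps polling even after all jobs finish, until something like stop() is called, so the non-daemon thread passed to new Thread(jobControl) never exits on its own. Here's a self-contained sketch of that pattern, with a hypothetical PollingController standing in for JobControl; the fix would then be to call stop() after the allFinished() loop (or to mark the thread as a daemon before starting it):

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Hypothetical stand-in mimicking the suspected run()/stop() contract:
// run() keeps polling even after all jobs finish, until stop() is called.
class PollingController implements Runnable {
    private final AtomicBoolean stopped = new AtomicBoolean(false);
    private final AtomicBoolean finished = new AtomicBoolean(false);

    public boolean allFinished() { return finished.get(); }
    public void markFinished() { finished.set(true); }
    public void stop() { stopped.set(true); }

    @Override
    public void run() {
        // Never exits on its own; only stop() breaks the loop.
        while (!stopped.get()) {
            try {
                Thread.sleep(50);
            } catch (InterruptedException e) {
                return;
            }
        }
    }
}

public class StopDemo {
    public static void main(String[] args) throws InterruptedException {
        PollingController jobControl = new PollingController();
        Thread t = new Thread(jobControl);
        t.start();

        jobControl.markFinished(); // pretend the jobs completed
        while (!jobControl.allFinished()) {
            Thread.sleep(50);
        }
        jobControl.stop(); // without this, the non-daemon thread keeps the JVM alive
        t.join();
        System.out.println("terminated cleanly");
    }
}
```

If JobControl really does behave this way, adding jobControl.stop() after the loop in my original snippet should let main() terminate.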
