> Are tasks being executed multiple times due to failures? Sorry, it was
> not very clear from your question.

yes, and I simply want to skip them if they fail more than x times (after
all, this is big data :) ).

-Håvard

On Sun, Jan 6, 2013 at 3:01 PM, Hemanth Yamijala <[email protected]> wrote:
> Hi,
>
> Are tasks being executed multiple times due to failures? Sorry, it was not
> very clear from your question.
>
> Thanks
> hemanth
>
>
> On Sat, Jan 5, 2013 at 7:44 PM, David Parks <[email protected]> wrote:
>>
>> Thinking here... if you submitted the task programmatically you should be
>> able to capture the failure of the task and gracefully move past it to
>> your next tasks.
>>
>> To say it in a long-winded way: let's say you submit a job to Hadoop, a
>> java jar, and your main class implements Tool. That code has the
>> responsibility to submit a series of jobs to hadoop, something like this:
>>
>> try {
>>     Job myJob = new MyJob(getConf());
>>     myJob.submitAndWait();
>> } catch (Exception uhhohh) {
>>     // Deal with the issue and move on
>> }
>> Job myNextJob = new MyNextJob(getConf());
>> myNextJob.submit();
>>
>> Just pseudo code there to demonstrate my thought.
>>
>> David
>>
>>
>>
>> -----Original Message-----
>> From: Håvard Wahl Kongsgård [mailto:[email protected]]
>> Sent: Saturday, January 05, 2013 4:54 PM
>> To: user
>> Subject: Skipping entire task
>>
>> Hi, hadoop can skip bad records
>>
>> http://devblog.factual.com/practical-hadoop-streaming-dealing-with-brittle-code
>>
>> But is it also possible to skip entire tasks?
>>
>> -Håvard
>>
>> --
>> Håvard Wahl Kongsgård
>> Faculty of Medicine &
>> Department of Mathematical Sciences
>> NTNU
>>
>> http://havard.security-review.net/
>>

-- 
Håvard Wahl Kongsgård
Faculty of Medicine &
Department of Mathematical Sciences
NTNU

http://havard.security-review.net/
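David's pseudo code above can be fleshed out into a runnable sketch of the catch-and-continue pattern. Note that `MyJob`, `MyNextJob`, and `submitAndWait()` are hypothetical stand-ins for illustration, not the real `org.apache.hadoop.mapreduce` API (where the closest equivalent call is `Job.waitForCompletion(true)`, which returns false or throws when the job fails):

```java
// Self-contained sketch of the catch-and-continue job chain described above.
// The job classes here are stand-ins so the control flow can run without a
// cluster; in real Hadoop code you would catch around waitForCompletion(true).
public class JobChain {

    // Hypothetical first job that always fails, simulating tasks
    // failing more than the tolerated number of times.
    static class MyJob {
        void submitAndWait() throws Exception {
            throw new Exception("too many task failures");
        }
    }

    // Hypothetical follow-up job that succeeds.
    static class MyNextJob {
        boolean submitAndWait() {
            return true;
        }
    }

    // Runs the chain: a failure in the first job is caught and logged,
    // and the second job is submitted regardless.
    public static boolean runChain() {
        try {
            new MyJob().submitAndWait();
        } catch (Exception uhhohh) {
            // Deal with the issue (log it, mark the job skipped) and move on.
            System.out.println("skipping failed job: " + uhhohh.getMessage());
        }
        return new MyNextJob().submitAndWait();
    }

    public static void main(String[] args) {
        System.out.println("next job succeeded: " + runChain());
    }
}
```

The point of the pattern is that the driver, not the framework, decides that a failed job is tolerable: the catch block swallows the failure so the rest of the chain still runs.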
