Hi Jay,

If a single task fails four times (the default), the job will be marked as failed. Is that what you're looking for?
Or, if you want your job to succeed even when not all of its tasks succeed, tweak the "mapred.max.map/reduce.failures.percent" property in your job (by default it tolerates 0% failures, so set the percentage of task failures that is acceptable to you, as an integer between 0 and 100). To then avoid waiting through four attempts for each task, lower "mapred.map/reduce.max.attempts" from its default of 4.

Does this answer your question?

On Sat, Jul 21, 2012 at 2:47 AM, jay vyas <jayunit...@gmail.com> wrote:
> Hi guys: I want my tasks to end/fail, but I don't want to kill my entire
> hadoop job.
>
> I have a hadoop job that runs 5 hadoop jobs in a row.
> I'm on the last of those sub-jobs, and want to fail all tasks so that the
> task tracker stops delegating them,
> and the hadoop main job can naturally come to a close.
>
> However, when I run "hadoop job kill-attempt / fail-attempt ....", the
> jobtracker seems to simply relaunch
> the same tasks with new ids.
>
> How can I tell the jobtracker to give up on redelegating?

-- 
Harsh J
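For the archive: a sketch of how the two settings above might look in a job's configuration. The property names are the classic mapred ones from the reply; the 5% tolerance and single-attempt values are illustrative examples, not recommendations — pick values that fit your job.

```xml
<!-- Allow up to 5% of map and reduce tasks to fail without failing the job -->
<property>
  <name>mapred.max.map.failures.percent</name>
  <value>5</value>
</property>
<property>
  <name>mapred.max.reduce.failures.percent</name>
  <value>5</value>
</property>

<!-- Give up on a failing task after one attempt instead of the default 4 -->
<property>
  <name>mapred.map.max.attempts</name>
  <value>1</value>
</property>
<property>
  <name>mapred.reduce.max.attempts</name>
  <value>1</value>
</property>
```

The same values can be set programmatically on the old API's JobConf (e.g. setMaxMapTaskFailuresPercent and setMaxMapAttempts) or passed on the command line with -D.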