You can use
$ hadoop job -kill <jobid>
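
If you don't know the job id, listing the running jobs first should work (a quick
sketch; the id below is only illustrative):

$ mapred job -list    # or the older alias: hadoop job -list
$ mapred job -kill job_1428933563890_0001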

On Mon, Apr 13, 2015 at 10:20 AM, Rohith Sharma K S <
rohithsharm...@huawei.com> wrote:

>  In addition to the options below, Hadoop 2.7 (to be released in a couple of
> weeks) provides a user-friendly option for killing applications from the Web
> UI.
>
>
>
> In the application block, a *‘Kill Application’* button has been provided for
> killing applications.
>
>
>
> Thanks & Regards
>
> Rohith Sharma K S
>
> *From:* Pradeep Gollakota [mailto:pradeep...@gmail.com]
> *Sent:* 12 April 2015 23:41
> *To:* user@hadoop.apache.org
> *Subject:* Re: How to stop a mapreduce job from terminal running on
> Hadoop Cluster?
>
>
>
> Also, mapred job -kill <job_id>
>
>
>
> On Sun, Apr 12, 2015 at 11:07 AM, Shahab Yunus <shahab.yu...@gmail.com>
> wrote:
>
> You can kill it by using the following yarn command:
>
>
>
> yarn application -kill <application id>
>
>
> https://hadoop.apache.org/docs/r2.2.0/hadoop-yarn/hadoop-yarn-site/YarnCommands.html
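>
> If you need to look up the application id first, yarn application -list shows
> the currently submitted/running applications (a minimal sketch; the id below
> is only illustrative):
>
> $ yarn application -list
> $ yarn application -kill application_1428933563890_0001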
>
>
>
> Or use the old hadoop job command:
>
> http://stackoverflow.com/questions/11458519/how-to-kill-hadoop-jobs
>
>
>
> Regards,
>
> Shahab
>
>
>
> On Sun, Apr 12, 2015 at 2:03 PM, Answer Agrawal <yrsna.tse...@gmail.com>
> wrote:
>
> To run a job we use the command
> $ hadoop jar example.jar inputpath outputpath
> If the job takes a long time and we want to stop it partway through, which
> command should be used? Or is there any other way to do that?
>
> Thanks,
>



-- 
*Thanks & Regards *


*Unmesha Sreeveni U.B*
*Hadoop, Bigdata Developer*
*Centre for Cyber Security | Amrita Vishwa Vidyapeetham*
http://www.unmeshasreeveni.blogspot.in/
