Actually, I don't want to kill jobs.

The job list now contains 5000 entries. I already have a script that queries
the cluster API and gets the JSON for this list, and I know I can filter on
that request. What I need is to delete older jobs from the list itself, so
that I don't have to fetch the info for all 5000 jobs on every request.
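
For what it's worth, a minimal sketch of the filtering I mean, assuming the
standard ResourceManager REST API on port 8088 (rm-host is a placeholder):
the limit and startedTimeBegin query parameters on /ws/v1/cluster/apps
restrict the response to recent applications instead of the full history.

```bash
# Hypothetical ResourceManager address; replace rm-host with yours.
RM="http://rm-host:8088"

# startedTimeBegin is in epoch milliseconds; only fetch apps started
# in the last 24 hours, capped at 100 entries, rather than all 5000.
SINCE=$(( ($(date +%s) - 86400) * 1000 ))

curl -s "${RM}/ws/v1/cluster/apps?limit=100&startedTimeBegin=${SINCE}"
```

But filtering only shrinks each response. For trimming the list itself, the
docs suggest yarn.resourcemanager.max-completed-applications in yarn-site.xml
bounds how many completed applications the ResourceManager keeps; lowering it
should make the RM drop the oldest entries on its own, though that is a
retention setting, not a delete command. Can anyone confirm?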

On Thu, Mar 23, 2017 at 12:33 AM, 정현진 <[email protected]> wrote:

> Alexis,
>
> If you need to kill all running jobs, you can use this bash script.
>
> ```bash
>
> # get the list of job IDs
> JOB_LIST=$(hadoop job -list 2> /dev/null | grep job_ | awk '{print $1}')
>
> # kill all jobs
> for JOB in $JOB_LIST
> do
>         hadoop job -kill "$JOB"
>         echo "job (${JOB}) is killed now."
> done
>
> ```
>
>
> - Hyeonjin Jung
>
> On Mar 18, 2017 at 9:43 PM, "Alexis Fidalgo" <[email protected]> wrote:
>
>> Hello, is there any way to configure this list (the job list) to
>> auto-clean, or any command to clear it without restarting YARN?
>>
>> thanks
