Thanks, Park. I am doing the same; I was trying to understand whether there are
other ways.
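
In case it helps the archives, one more option I came across: Spark can stop a
streaming job gracefully when the driver JVM receives a SIGTERM, if
spark.streaming.stopGracefullyOnShutdown is set. A minimal sketch, assuming
yarn-client mode; the class and jar names below are hypothetical:

    # Launch with graceful shutdown enabled (class/jar names are made up):
    nohup spark-submit \
      --master yarn --deploy-mode client \
      --conf spark.streaming.stopGracefullyOnShutdown=true \
      --class com.example.StreamingJob streaming-job.jar &

    # Later, send a plain SIGTERM (not kill -9) so Spark's shutdown hook can
    # finish the in-flight batch before the driver exits:
    kill -SIGTERM <driver_pid>

With kill -9 the shutdown hook never runs, so any batch in flight is dropped.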

Thanks,
Pradeep

> On Aug 2, 2016, at 10:25 PM, Park Kyeong Hee <kh1979.p...@samsung.com> wrote:
> 
> So sorry. Your name is Pradeep!!
> 
> -----Original Message-----
> From: Park Kyeong Hee [mailto:kh1979.p...@samsung.com] 
> Sent: Wednesday, August 03, 2016 11:24 AM
> To: 'Pradeep'; 'user@spark.apache.org'
> Subject: RE: Stop Spark Streaming Jobs
> 
> Hi, Paradeep
> 
> 
> Did you mean how to kill the job?
> If yes, you should kill the driver, then follow the steps below.
> 
> on yarn-client (a one-line variant follows these steps)
> 1. find the driver's pid - "ps -ef | grep <your_jobs_main_class>"
> 2. kill it - "kill -9 <pid>"
> 3. check that the executors are down - "yarn application -list"
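> 
> For reference, steps 1 and 2 can be combined into one line. The main class
> here is hypothetical, and the [c] trick keeps grep from matching itself:
> 
>    # Kill the driver in one shot (assumes the class name is unique on the box):
>    kill -9 $(ps -ef | grep '[c]om.example.StreamingJob' | awk '{print $2}')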
> 
> on yarn-cluster (a sample session follows these steps)
> 1. find the driver's application ID - "yarn application -list"
> 2. stop it - "yarn application -kill <app_ID>"
> 3. check that the driver and executors are down - "yarn application -list"
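> 
> A sample session; the application ID value below is made up, and -appStates
> simply filters the listing:
> 
>    yarn application -list -appStates RUNNING
>    yarn application -kill application_1470123456789_0042
>    yarn application -list -appStates RUNNING   # confirm it is gone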
> 
> 
> Thanks.
> 
> -----Original Message-----
> From: Pradeep [mailto:pradeep.mi...@mail.com] 
> Sent: Wednesday, August 03, 2016 10:48 AM
> To: user@spark.apache.org
> Subject: Stop Spark Streaming Jobs
> 
> Hi All,
> 
> My streaming job reads data from Kafka. The job is launched with nohup and
> pushed to the background, roughly as in the sketch below.
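> 
>    # Roughly how the job is launched (class and jar names are hypothetical):
>    nohup spark-submit --master yarn \
>      --class com.example.StreamingJob streaming-job.jar \
>      > streaming-job.log 2>&1 &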
> 
> What are the recommended ways to stop the job in either yarn-client or
> yarn-cluster mode?
> 
> Thanks,
> Pradeep
> 


---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
