Re: Stop Spark Streaming Jobs

2016-08-04 Thread Sandeep Nemuri
> On Aug 2, 2016, at 10:25 PM, Park Kyeong Hee <kh1979.p...@samsung.com> wrote:
>
> So sorry. Your name was Pradeep !!
>
> -----Original Message-----
> From: Park Kyeong Hee [mailto:kh1979.p...@samsung.com] [...]

Re: Stop Spark Streaming Jobs

2016-08-03 Thread Tony Lane
> Park Kyeong Hee <kh1979.p...@samsung.com> wrote:
>
> So sorry. Your name was Pradeep !!
>
> -----Original Message-----
> From: Park Kyeong Hee [mailto:kh1979.p...@samsung.com]
> Sent: Wednesday, August 03, 2016 11:24 AM
> To: 'Pradeep'; 'user@ [...]

Re: Stop Spark Streaming Jobs

2016-08-02 Thread Pradeep
> From: Park Kyeong Hee [mailto:kh1979.p...@samsung.com]
> Sent: Wednesday, August 03, 2016 11:24 AM
> To: 'Pradeep'; 'user@spark.apache.org'
> Subject: RE: Stop Spark Streaming Jobs
>
> Hi, Pradeep
>
> Did you mean, how to kill the job?
> If yes, you should kill the driver and follow next. [...]

RE: Stop Spark Streaming Jobs

2016-08-02 Thread Park Kyeong Hee
Hi, Pradeep.

Did you mean, how to kill the job? If yes, you should kill the driver and follow next.

On yarn-client:
1. find the driver's pid - "ps -ef | grep <driver name>"
2. kill it - "kill -9 <pid>"
3. check that the executors went down - "yarn application -list"

On yarn-cluster:
1. find the driver's application ID - "yarn application -list" [...]
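The yarn-client steps above (find the driver's pid in `ps -ef` output, then `kill -9` it) can be sketched with a small helper. This is only an illustration: `find_driver_pids`, `MyStreamingApp`, and the sample `ps` output are hypothetical names, not anything from the thread. For yarn-cluster, the standard YARN CLI equivalent of the kill step is `yarn application -kill <application ID>`.

```python
def find_driver_pids(ps_output, app_name):
    """Return PIDs from `ps -ef`-style output whose command line mentions app_name.

    Skips the `grep` process itself, which would also match when the
    output is produced by `ps -ef | grep <app name>`.
    """
    pids = []
    for line in ps_output.splitlines():
        if app_name in line and "grep" not in line:
            fields = line.split()
            # In `ps -ef` output the PID is the second column.
            if len(fields) > 1 and fields[1].isdigit():
                pids.append(int(fields[1]))
    return pids


# Hypothetical `ps -ef` output containing one driver process and the grep itself.
sample = (
    "UID   PID  PPID  C STIME TTY   TIME  CMD\n"
    "user 4242     1  0 10:00 ?     00:01 "
    "java org.apache.spark.deploy.SparkSubmit MyStreamingApp\n"
    "user 4300  4299  0 10:05 pts/0 00:00 grep MyStreamingApp\n"
)
print(find_driver_pids(sample, "MyStreamingApp"))  # [4242]
```

Once the pid is known, `kill -9 <pid>` stops the driver, and `yarn application -list` confirms the executors went away.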