I do this in my stop script to kill the application:

kill -s SIGTERM $(pgrep -f StreamingApp)

and to stop it forcefully:

pkill -9 -f "StreamingApp"

StreamingApp is the name of the class I submitted.
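
Put together, a minimal stop script along those lines might look like the
sketch below (the file name is made up; it assumes the driver runs on the
same host and only one process matches the pattern):

#!/bin/bash
# stop-streaming-app.sh -- minimal sketch, assuming the driver runs on
# this host and exactly one process command line contains "StreamingApp".
PID=$(pgrep -f StreamingApp | head -n 1)
if [ -z "$PID" ]; then
  echo "StreamingApp is not running"
  exit 0
fi

# Ask for a graceful stop first, so the shutdown hook gets to run.
kill -s SIGTERM "$PID"

# Give the hook up to 60 seconds to drain in-flight batches.
for _ in $(seq 1 60); do
  kill -0 "$PID" 2>/dev/null || exit 0
  sleep 1
done

# Still alive after the grace period: stop it forcefully.
pkill -9 -f "StreamingApp"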

I also have a shutdown hook thread to stop it gracefully:

sys.ShutdownHookThread {
  logInfo("Gracefully stopping StreamingApp")
  // Stop the StreamingContext and the underlying SparkContext, waiting
  // for in-flight batches to complete (graceful stop).
  ssc.stop(stopSparkContext = true, stopGracefully = true)
  logInfo("StreamingApp stopped")
}
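
As a side note, on Spark 1.4+ the streaming module can register an
equivalent graceful-shutdown hook for you via a config flag, so a plain
SIGTERM to the driver triggers a graceful stop without hand-rolled hook
code. A sketch of the submit command (the jar name here is made up):

spark-submit --master spark://10.1.40.18:7077 \
  --conf spark.streaming.stopGracefullyOnShutdown=true \
  --class StreamingApp streaming-app.jar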

I am also not able to kill the application from the Spark UI.


On Sat, Nov 21, 2015 at 11:32 AM, Vikram Kone <vikramk...@gmail.com> wrote:

> I tried adding a shutdown hook to my code but it didn't help. Still the
> same issue.
>
>
> On Fri, Nov 20, 2015 at 7:08 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>
>> Which Spark release are you using?
>>
>> Can you pastebin the stack trace of the process running on your machine?
>>
>> Thanks
>>
>> On Nov 20, 2015, at 6:46 PM, Vikram Kone <vikramk...@gmail.com> wrote:
>>
>> Hi,
>> I'm seeing a strange problem. I have a Spark cluster in standalone mode.
>> I submit Spark jobs from a remote node via the terminal as follows:
>>
>> spark-submit --master spark://10.1.40.18:7077 --class com.test.Ping
>> spark-jobs.jar
>>
>> While the app is running, if I press Ctrl-C in the console terminal, the
>> process is killed and so is the app in the Spark master UI. When I go to
>> the Spark master UI, I see that this app is in state Killed under
>> Completed Applications, which is what I expected to see.
>>
>> Now, I created a shell script as follows to do the same:
>>
>> #!/bin/bash
>> spark-submit --master spark://10.1.40.18:7077 \
>>   --class com.test.Ping spark-jobs.jar &
>> echo $! > my.pid
>>
>> When I execute the shell script from terminal, as follows
>>
>> $> bash myscript.sh
>>
>> The application is submitted correctly to the Spark master and I can see
>> it as one of the running apps in the Spark master UI. But when I kill the
>> process in my terminal as follows:
>>
>> $> kill $(cat my.pid)
>>
>> I see that the process is killed on my machine but the Spark application
>> is still running on the Spark master! It doesn't get killed.
>>
>> I noticed one more thing: when I launch the Spark job via the shell
>> script and kill the application from the Spark master UI by clicking
>> "kill" next to the running application, it gets killed in the Spark UI,
>> but I still see the process running on my machine.
>>
>> In both cases, I would expect both the remote Spark app and my local
>> process to be killed.
>>
>> Why is this happening? And how can I kill a Spark app from the terminal,
>> launched via a shell script, without going to the Spark master UI?
>>
>> I want to launch the Spark app via a script and log the PID so I can
>> monitor it remotely.
>>
>> Thanks for the help.
>>
>>
>
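
For completeness: if the app is submitted with --deploy-mode cluster
against the standalone REST endpoint (port 6066 by default), the driver
can also be killed from the terminal without the UI, using the submission
ID printed at submit time (the ID below is made up):

spark-submit --master spark://10.1.40.18:6066 --kill driver-20151121123456-0001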


-- 
*VARUN SHARMA*
*Flipkart*
*Bangalore*
