Interesting, SPARK-3090 installs a shutdown hook for stopping the SparkContext.

FYI

On Fri, Nov 20, 2015 at 7:12 PM, Stéphane Verlet <kaweahsoluti...@gmail.com>
wrote:

> I solved the first issue by adding a shutdown hook in my code. The
> shutdown hook gets called when you exit your script (Ctrl-C, kill … but not
> kill -9).
>
> val shutdownHook = scala.sys.addShutdownHook {
>   try {
>     sparkContext.stop()
>     // Make sure to kill any other threads or thread pools you may be running
>   } catch {
>     case e: Exception =>
>       // ... handle/log the exception
>   }
> }
>
> For the other issue, killing from the UI: I also had that problem. It was
> caused by a thread pool that I use.
>
> So I surrounded my code with a try/finally block to guarantee that the
> thread pool was shut down when Spark stopped.
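>
> Roughly, that pattern looks like the sketch below (a minimal sketch assuming
> a plain java.util.concurrent pool; the names are illustrative, not my exact
> code):
>
> import java.util.concurrent.{Executors, TimeUnit}
>
> val threadPool = Executors.newFixedThreadPool(4)
> try {
>   // submit and run the work that uses the SparkContext here
> } finally {
>   // always release the pool so the JVM (and spark-submit) can exit cleanly
>   threadPool.shutdown()
>   threadPool.awaitTermination(30, TimeUnit.SECONDS)
> }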
>
> I hope this helps.
>
> Stephane
>
> On Fri, Nov 20, 2015 at 7:46 PM, Vikram Kone <vikramk...@gmail.com> wrote:
>
>> Hi,
>> I'm seeing a strange problem. I have a Spark cluster in standalone mode.
>> I submit Spark jobs from a remote node, from the terminal, as follows:
>>
>> spark-submit --master spark://10.1.40.18:7077 --class com.test.Ping \
>> spark-jobs.jar
>>
>> While the app is running, if I press Ctrl-C in the console terminal, the
>> process is killed and so is the app in the Spark master UI. When I go to the
>> Spark master UI, I see that this app is in state Killed under Completed
>> Applications, which is what I expected to see.
>>
>> Now, I created a shell script as follows to do the same
>>
>> #!/bin/bash
>> spark-submit --master spark://10.1.40.18:7077 --class com.test.Ping \
>> spark-jobs.jar
>> echo $! > my.pid
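>>
>> (Note that $! only holds the PID of the most recently backgrounded command,
>> so a variant that backgrounds spark-submit explicitly, just as a sketch,
>> would be:)
>>
>> #!/bin/bash
>> # background spark-submit so $! refers to its PID
>> spark-submit --master spark://10.1.40.18:7077 --class com.test.Ping \
>>   spark-jobs.jar &
>> echo $! > my.pid
>> # block until spark-submit finishes, like the foreground version
>> wait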
>>
>> When I execute the shell script from the terminal, as follows
>>
>> $> bash myscript.sh
>>
>> The application is submitted correctly to the Spark master and I can see it
>> as one of the running apps in the Spark master UI. But when I kill the
>> process in my terminal as follows
>>
>> $> kill $(cat my.pid)
>>
>> I see that the process is killed on my machine, but the Spark application
>> is still running in the Spark master! It doesn't get killed.
>>
>> I noticed one more thing: when I launch the Spark job via the shell script
>> and kill the application from the Spark master UI by clicking on "kill" next
>> to the running application, it gets killed in the Spark UI, but I still see
>> the process running on my machine.
>>
>> In both cases, I would expect both the remote Spark app and my local
>> process to be killed.
>>
>> Why is this happening? And how can I kill a Spark app launched via a shell
>> script from the terminal, without going to the Spark master UI?
>>
>> I want to launch the Spark app via a script and log the PID so I can
>> monitor it remotely.
>>
>> thanks for the help
>>
>>
>
