I don't think that's possible at the moment, mainly because
SparkSubmit expects to be run from the command line, not
programmatically, so it doesn't return anything that can be used to
control what's going on. You could try to interrupt the thread calling
into SparkSubmit, but that might not work, especially if the app
doesn't handle the interrupt correctly.
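
If you do go the interrupt route, here's a rough, untested sketch of
what I mean (the job class, jar path and master are placeholders, and
again, there's no guarantee Spark or your app will react to the
interrupt):

    import org.apache.spark.deploy.SparkSubmit;

    public class InThreadSubmit {
      public static void main(String[] args) throws Exception {
        // Run SparkSubmit on a separate thread, as if it had been
        // invoked from the command line.
        Thread submitThread = new Thread(new Runnable() {
          public void run() {
            SparkSubmit.main(new String[] {
                "--class", "com.example.MyJob",   // placeholder job class
                "--master", "local[*]",
                "/path/to/my-job.jar"             // placeholder jar
            });
          }
        });
        submitThread.start();

        // Later: try to stop the job. This only delivers an interrupt;
        // whether anything actually stops depends on Spark and the app
        // honoring it.
        submitThread.interrupt();
        submitThread.join();
      }
    }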

Another thing to consider is that Spark itself doesn't play well with
multiple contexts running in the same JVM, so that would have to be
fixed before SparkSubmit could support that kind of use case.

Have you thought about spawning a child process to run SparkSubmit?
Then you can kill the underlying process if you need to.
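
For example (rough sketch, untested; the spark-submit path, job class
and jar are placeholders):

    import java.io.File;

    public class ChildProcessSubmit {
      public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(
            "/path/to/spark/bin/spark-submit",   // placeholder path
            "--class", "com.example.MyJob",      // placeholder job class
            "--master", "local[*]",
            "/path/to/my-job.jar");              // placeholder jar
        pb.redirectErrorStream(true);
        pb.redirectOutput(new File("job.log"));

        Process job = pb.start();

        // Placeholder: let the job run until whatever condition should
        // trigger the kill.
        Thread.sleep(10000);

        // Killing the child process stops everything in local mode. In
        // yarn-cluster mode the driver runs on the cluster, so you'd
        // also need to kill the YARN application (e.g. with
        // "yarn application -kill <appId>").
        job.destroy();
        job.waitFor();
      }
    }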


On Thu, Sep 4, 2014 at 2:17 PM, randomuser54 <talktorohi...@gmail.com> wrote:
> I have a Java class which calls SparkSubmit.scala with all the arguments to
> run a Spark job in a thread. I am running the jobs in local mode for now,
> but I also want to run them in yarn-cluster mode later.
>
> Now, I want to kill the running spark job (which can be in local or
> yarn-cluster mode) programmatically.
>
> I know that SparkContext has a stop() method, but from the thread from
> which I am calling SparkSubmit I don't have access to it. Can someone
> suggest how to do this properly?
>
> Thanks.



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
