Hi,

It appears you're running in local mode (local[*] assumed), so killing
spark-shell *will* kill the one and only executor -- the driver :)
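In practice that boils down to finding the SparkSubmit JVM and killing it. A minimal sketch, assuming a JDK `jps` is on the PATH (the class name matched below is taken from the `jps -lm` output quoted further down in this thread):

```shell
# List running JVMs with their main class and arguments (jps ships with the JDK).
jps -lm

# In local mode the driver and the single executor share one JVM, launched
# through org.apache.spark.deploy.SparkSubmit. Extract its PID and kill it.
pid=$(jps -lm | awk '/org\.apache\.spark\.deploy\.SparkSubmit/ {print $1}')
[ -n "$pid" ] && kill "$pid"
```

In standalone or YARN deployments the executors run as separate CoarseGrainedExecutorBackend JVMs, so there you would kill one of those processes instead to stop only an executor.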

Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Thu, Jul 7, 2016 at 10:27 PM, Mr rty ff <yash...@yahoo.com> wrote:
> This what I get when I run the command
> 946 sun.tools.jps.Jps -lm
> 7443 org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main
> --name Spark shell spark-shell
> I don't think that should kill the SparkSubmit process
>
>
>
> On Thursday, July 7, 2016 9:58 PM, Jacek Laskowski <ja...@japila.pl> wrote:
>
>
> Hi,
>
> Use jps -lm and see the processes on the machine(s) to kill.
>
> Pozdrawiam,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Wed, Jul 6, 2016 at 9:49 PM, Mr rty ff <yash...@yahoo.com.invalid> wrote:
>> Hi
>> I'd like to reproduce this bug:
>> https://issues.apache.org/jira/browse/SPARK-13979
>> It talks about stopping Spark executors, but it's not clear exactly how I
>> stop the executors.
>> Thanks
>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>
>
>
