[ https://issues.apache.org/jira/browse/SPARK-1740?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Josh Rosen resolved SPARK-1740.
-------------------------------

       Resolution: Fixed
    Fix Version/s: 1.1.0

> Pyspark cancellation kills unrelated pyspark workers
> ----------------------------------------------------
>
>                 Key: SPARK-1740
>                 URL: https://issues.apache.org/jira/browse/SPARK-1740
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.0.0
>            Reporter: Aaron Davidson
>            Assignee: Davies Liu
>            Priority: Critical
>             Fix For: 1.1.0
>
>
> PySpark cancellation calls SparkEnv#destroyPythonWorker. Since there is one 
> Python worker per process, this would seem like a sensible thing to do. 
> Unfortunately, this method actually destroys a Python daemon and all of its 
> associated workers, which means it can cause failures in unrelated PySpark 
> jobs.
>
> The severity of this bug is limited by the fact that the PySpark daemon is 
> easily recreated, so the affected tasks will succeed once restarted.



--
This message was sent by Atlassian JIRA
(v6.2#6252)
