[ https://issues.apache.org/jira/browse/SPARK-26886?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

luzengxiang resolved SPARK-26886.
---------------------------------
    Resolution: Won't Do

> Proper termination of external processes launched by the worker
> ---------------------------------------------------------------
>
>                 Key: SPARK-26886
>                 URL: https://issues.apache.org/jira/browse/SPARK-26886
>             Project: Spark
>          Issue Type: Story
>          Components: Spark Core
>    Affects Versions: 2.4.0
>            Reporter: luzengxiang
>            Priority: Minor
>
> When embedding a deep learning framework in Spark, the Spark worker 
> sometimes has to launch an external process (e.g. an MPI task). 
> {quote}
> val nothing = inputData.barrier().mapPartitions { _ =>
>   val barrierTask = BarrierTaskContext.get()
>   // save data to disk
>   barrierTask.barrier()
>   // launch external process, e.g. MPI task + TensorFlow
>   Iterator.empty
> }
> {quote}
>  
> The problem is that the external process keeps running after the Spark 
> task is killed manually. This Jira is the place to discuss how to 
> properly terminate external processes launched by the Spark worker when 
> a Spark task is killed or interrupted.
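> A sketch of one possible mitigation (an assumption, not the resolved 
> design): tie the external process's lifetime to the task by registering 
> a completion listener that destroys it. mpiCmd below is a placeholder 
> command; this covers a task kill or interrupt on a live executor, but 
> not the case where the executor JVM itself dies.
> {quote}
> import org.apache.spark.BarrierTaskContext
> 
> val mpiCmd = Seq("mpirun", "-np", "4", "train_binary") // placeholder
> val result = inputData.barrier().mapPartitions { _ =>
>   val barrierTask = BarrierTaskContext.get()
>   // save data to disk
>   barrierTask.barrier()
> 
>   // launch the external process, sharing the executor's stdio
>   val process = new ProcessBuilder(mpiCmd: _*).inheritIO().start()
> 
>   // cleanup runs when the task completes, fails, or is killed; a kill
>   // interrupts waitFor(), failing the task and firing this listener
>   barrierTask.addTaskCompletionListener[Unit] { _ =>
>     if (process.isAlive) process.destroyForcibly()
>   }
> 
>   process.waitFor()
>   Iterator.empty
> }
> {quote}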


