[ https://issues.apache.org/jira/browse/SPARK-26886?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16769165#comment-16769165 ]
luzengxiang commented on SPARK-26886:
-------------------------------------
[~mengxr] Let's discuss it here.
> Proper termination of external processes launched by the worker
> ---------------------------------------------------------------
>
> Key: SPARK-26886
> URL: https://issues.apache.org/jira/browse/SPARK-26886
> Project: Spark
> Issue Type: New JIRA Project
> Components: Spark Core
> Affects Versions: 2.4.0
> Reporter: luzengxiang
> Priority: Minor
>
> When embedding a deep learning framework in Spark, the Spark worker has to
> launch an external process (e.g. an MPI task) in some cases:
> {quote}val nothing = inputData.barrier().mapPartitions { _ =>
>   val barrierTask = BarrierTaskContext.get()
>   // save data to disk
>   barrierTask.barrier()
>   // launch external process, e.g. MPI task + TensorFlow
>   Iterator.empty
> }
> {quote}
>
> The problem is that the external process keeps running when the Spark task
> is killed manually. This JIRA is the place to discuss how to properly
> terminate external processes launched by the Spark worker when the Spark
> task is killed or interrupted.