[https://issues.apache.org/jira/browse/SPARK-13904?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15195007#comment-15195007]
Apache Spark commented on SPARK-13904:
--------------------------------------
User 'hbhanawat' has created a pull request for this issue:
https://github.com/apache/spark/pull/11723
> Add support for pluggable cluster manager
> -----------------------------------------
>
> Key: SPARK-13904
> URL: https://issues.apache.org/jira/browse/SPARK-13904
> Project: Spark
> Issue Type: Improvement
> Components: Scheduler
> Reporter: Hemant Bhanawat
>
> Currently Spark allows only a few cluster managers, viz. YARN, Mesos, and
> Standalone. But as Spark is now being used in newer and more varied use cases,
> there is a need to allow other cluster managers to manage Spark components.
> One such use case is embedding Spark components, such as the executor and the
> driver, inside another process, which may be a datastore. This allows
> colocation of data and processing. Another requirement that stems from such a
> use case is that the executors/driver should not take the parent process down
> when they go down, and that the components can be relaunched inside the same
> process.
> So, this JIRA requests two functionalities:
> 1. Support for external cluster managers
> 2. Allow a cluster manager to clean up the tasks without taking the parent
> process down.
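The pluggable interface requested above can be sketched roughly as follows. This is a self-contained, hypothetical sketch, not Spark's actual API: the trait and method names (`ExternalClusterManager`, `canCreate`, `createTaskScheduler`, `createSchedulerBackend`, `initialize`), the stub `TaskScheduler`/`SchedulerBackend` types, the `EmbeddedClusterManager` plug-in, the `embedded://` URL scheme, and the `ClusterManagerRegistry` are all assumptions made for illustration; the eventual shape in the linked pull request may differ.

```scala
// Hypothetical sketch of a pluggable cluster-manager SPI selected from the
// master URL. Stub stand-ins replace Spark's real TaskScheduler and
// SchedulerBackend so the example is self-contained.
trait TaskScheduler { def start(): Unit }
trait SchedulerBackend { def start(): Unit }

// The pluggable contract: a manager declares which master URLs it handles
// and constructs the scheduler components for them.
trait ExternalClusterManager {
  def canCreate(masterURL: String): Boolean
  def createTaskScheduler(masterURL: String): TaskScheduler
  def createSchedulerBackend(masterURL: String, ts: TaskScheduler): SchedulerBackend
  def initialize(ts: TaskScheduler, backend: SchedulerBackend): Unit
}

// Example plug-in for the embedded-datastore use case described above:
// executors and the driver live inside the datastore's own process.
class EmbeddedClusterManager extends ExternalClusterManager {
  def canCreate(masterURL: String): Boolean = masterURL.startsWith("embedded://")
  def createTaskScheduler(masterURL: String): TaskScheduler =
    new TaskScheduler { def start(): Unit = () }
  def createSchedulerBackend(masterURL: String, ts: TaskScheduler): SchedulerBackend =
    new SchedulerBackend { def start(): Unit = () }
  def initialize(ts: TaskScheduler, backend: SchedulerBackend): Unit = ()
}

object ClusterManagerRegistry {
  // A real implementation might discover managers via java.util.ServiceLoader;
  // a plain list keeps this sketch runnable on its own.
  private val managers: Seq[ExternalClusterManager] = Seq(new EmbeddedClusterManager)

  // Pick the first registered manager that claims the master URL.
  def forMaster(masterURL: String): Option[ExternalClusterManager] =
    managers.find(_.canCreate(masterURL))
}
```

Resolving the manager from the master URL (rather than hard-coding YARN/Mesos/Standalone) is what lets a third-party process register its own manager without patching Spark's scheduler.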
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)