[
https://issues.apache.org/jira/browse/SPARK-6703?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-6703:
-----------------------------------
Assignee: Ilya Ganelin (was: Apache Spark)
> Provide a way to discover existing SparkContext's
> -------------------------------------------------
>
> Key: SPARK-6703
> URL: https://issues.apache.org/jira/browse/SPARK-6703
> Project: Spark
> Issue Type: New Feature
> Components: Spark Core
> Affects Versions: 1.3.0
> Reporter: Patrick Wendell
> Assignee: Ilya Ganelin
> Priority: Critical
>
> Right now it is difficult to write a Spark application in a way that can be
> run independently and also be composed with other Spark applications in an
> environment such as the JobServer, notebook servers, etc., where there is a
> shared SparkContext.
> It would be nice to provide a rendezvous point so that applications can
> learn whether a SparkContext already exists before creating one.
> The simplest, most surgical way I see to do this is to have an optional static
> SparkContext singleton that can be retrieved as follows:
> {code}
> val sc = SparkContext.getOrCreate(conf = new SparkConf())
> {code}
> And you could also have a setter where some outer framework/server can set it
> for use by multiple downstream applications.
> A more advanced version of this would have some named registry or something,
> but since we only support a single SparkContext in one JVM at this point
> anyway, this seems sufficient and much simpler. Another advanced option
> would be to allow plugging in some other notion of configuration you'd pass
> when retrieving an existing context.
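The getOrCreate-plus-setter idea described above can be sketched in plain Scala. Note this is only an illustration of the proposed pattern, not the real Spark API: FakeConf and FakeContext are hypothetical stand-ins for SparkConf and SparkContext, and setActiveContext is an assumed name for the setter.

```scala
// Minimal sketch of the proposed pattern (NOT the real Spark API):
// FakeConf and FakeContext are stand-ins for SparkConf and SparkContext.
import java.util.concurrent.atomic.AtomicReference

case class FakeConf(appName: String = "default")

class FakeContext(val conf: FakeConf)

object FakeContext {
  // Optional static singleton holding the currently active context.
  private val active = new AtomicReference[FakeContext](null)

  // Setter: an outer framework/server registers a context that
  // multiple downstream applications can then share.
  def setActiveContext(ctx: FakeContext): Unit = active.set(ctx)

  // Rendezvous point: return the existing context if one is set,
  // otherwise create one from the supplied conf (atomically, so
  // concurrent callers still end up sharing a single instance).
  def getOrCreate(conf: FakeConf): FakeContext =
    active.updateAndGet(cur => if (cur == null) new FakeContext(conf) else cur)
}
```

With this shape, two applications calling getOrCreate in the same JVM receive the same instance, and the first conf supplied wins; later confs are ignored, which matches the single-SparkContext-per-JVM constraint mentioned above.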
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]