Patrick Wendell created SPARK-6703:
--------------------------------------
Summary: Provide a way to discover existing SparkContext's
Key: SPARK-6703
URL: https://issues.apache.org/jira/browse/SPARK-6703
Project: Spark
Issue Type: New Feature
Components: Spark Core
Reporter: Patrick Wendell
Right now it is difficult to write a Spark application in a way that can both run
independently and be composed with other Spark applications in an
environment such as the JobServer, notebook servers, etc., where there is a
shared SparkContext.
It would be nice to have a way to write an application where you can "get or
create" a SparkContext, with some standard synchronization point that
application authors can rely on. The simplest, most surgical way I see to do this
is to have an optional static SparkContext singleton that can be
retrieved as follows:
{code}
val sc = SparkContext.getOrCreate(conf = new SparkConf())
{code}
And you could also have a setter where some outer framework/server can set it
for use by multiple downstream applications.
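A minimal sketch of the proposed "get or create" plus setter pattern, using a hypothetical stand-in class instead of the real SparkContext (the `ActiveContext` object, `FakeContext` class, and method names here are illustrative assumptions, not the actual Spark API):

```scala
// Hypothetical stand-in for SparkContext, for illustration only.
class FakeContext(val conf: Map[String, String])

object ActiveContext {
  // Lock guarding the singleton, mirroring the "standard
  // synchronization point" the proposal describes.
  private val activationLock = new Object()
  @volatile private var activeContext: Option[FakeContext] = None

  // "Get or create": return the active context if one exists;
  // otherwise construct one from the supplied conf and register it.
  def getOrCreate(conf: Map[String, String]): FakeContext =
    activationLock.synchronized {
      activeContext.getOrElse {
        val ctx = new FakeContext(conf)
        activeContext = Some(ctx)
        ctx
      }
    }

  // Setter an outer framework/server could call so that multiple
  // downstream applications pick up the shared context.
  def setActiveContext(ctx: FakeContext): Unit =
    activationLock.synchronized { activeContext = Some(ctx) }
}
```

With this shape, a second `getOrCreate` call returns the already-registered instance and ignores the new conf, which is the composition behavior the proposal is after.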
A more advanced version of this would have some kind of named registry,
but since we only support a single SparkContext per JVM at this point
anyway, the singleton seems sufficient and much simpler. Another advanced option
would be to allow plugging in some other notion of configuration to pass when
retrieving an existing context.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]