Josh Rosen created SPARK-4180:
---------------------------------

             Summary: SparkContext constructor should throw exception if another SparkContext is already running
                 Key: SPARK-4180
                 URL: https://issues.apache.org/jira/browse/SPARK-4180
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
            Reporter: Josh Rosen
            Priority: Blocker
Spark does not currently support multiple concurrently running SparkContexts in the same JVM (see SPARK-2243). Therefore, SparkContext's constructor should throw an exception if there is an active SparkContext that has not been shut down via {{stop()}}. PySpark already does this; the Scala SparkContext should do the same. The current behavior with multiple active contexts is unspecified and not well understood, and it may be the source of confusing errors (see SPARK-4080, for example). This should be fairly easy to add: add an {{activeSparkContext}} field to the SparkContext companion object and synchronize on it in both the constructor and {{stop()}}; see PySpark's {{context.py}} for an example of this approach.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
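As a rough illustration of the guard described above (a shared field tracking the active context, with the constructor and {{stop()}} synchronizing on a common lock), here is a minimal Python sketch in the style of PySpark's approach. {{FakeContext}} and its field names are hypothetical, not Spark's actual code:

```python
import threading


class FakeContext:
    """Hypothetical sketch of the proposed guard: a class-level field
    tracks the single active context, and both the constructor and
    stop() synchronize on a shared lock before touching it."""

    _lock = threading.Lock()    # stands in for the companion-object monitor
    _active_context = None      # stands in for the activeSparkContext field

    def __init__(self):
        with FakeContext._lock:
            if FakeContext._active_context is not None:
                # Fail fast instead of allowing two live contexts.
                raise ValueError(
                    "Cannot run multiple contexts at once; "
                    "call stop() on the existing context first")
            FakeContext._active_context = self

    def stop(self):
        with FakeContext._lock:
            # Only clear the field if this instance is the active one.
            if FakeContext._active_context is self:
                FakeContext._active_context = None
```

With this pattern, constructing a second context before stopping the first raises immediately, while a stop()-then-construct sequence succeeds, which is the behavior the issue asks for.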