[ https://issues.apache.org/jira/browse/SPARK-4180?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Josh Rosen resolved SPARK-4180.
-------------------------------
          Resolution: Fixed
       Fix Version/s:     (was: 1.2.1)
                      1.2.0
    Target Version/s: 1.2.0  (was: 1.2.0, 1.0.3, 1.1.2)

I'm going to resolve this as fixed since it was included in 1.2.0. Now that 
we're about to release 1.3, I don't think we need to backport this into 
branch-1.0, so I'm going to remove the {{backport-needed}} label.

> SparkContext constructor should throw exception if another SparkContext is 
> already running
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-4180
>                 URL: https://issues.apache.org/jira/browse/SPARK-4180
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>            Reporter: Josh Rosen
>            Assignee: Josh Rosen
>            Priority: Blocker
>             Fix For: 1.3.0, 1.2.0
>
>
> Spark does not currently support multiple concurrently running SparkContexts 
> in the same JVM (see SPARK-2243).  Therefore, SparkContext's constructor 
> should throw an exception if there is an active SparkContext that has not 
> been shut down via {{stop()}}.
>
> PySpark already does this, but the Scala SparkContext should do the same 
> thing.  The current behavior with multiple active contexts is unspecified / 
> not well understood, and it may be the source of confusing errors (see the 
> user error report in SPARK-4080, for example).
>
> This should be pretty easy to add: just add an {{activeSparkContext}} field 
> to the SparkContext companion object and {{synchronize}} on it in the 
> constructor and {{stop()}} methods; see PySpark's {{context.py}} file for an 
> example of this approach.
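
For reference, a minimal sketch of the pattern the description suggests: a single JVM-wide slot in the companion object, guarded by a lock that the constructor and {{stop()}} both synchronize on. Names such as {{MySparkContext}} and {{activeContext}} are illustrative only; this is not the patch that actually landed in Spark.

{code:scala}
// Illustrative sketch (not Spark's real implementation): enforce at most one
// active context per JVM by tracking it in the companion object.
class MySparkContext(val appName: String) {
  import MySparkContext._

  // Constructor body: fail fast if another context is still active.
  lock.synchronized {
    if (activeContext.isDefined) {
      throw new IllegalStateException(
        "Only one SparkContext may be running in this JVM (see SPARK-2243). " +
        "Call stop() on the existing context before creating a new one.")
    }
    activeContext = Some(this)
  }

  def stop(): Unit = lock.synchronized {
    // Clear the slot only if this instance is the registered active context.
    if (activeContext.contains(this)) {
      activeContext = None
    }
  }
}

object MySparkContext {
  // JVM-wide state shared by all instances, guarded by `lock`.
  private val lock = new Object
  private var activeContext: Option[MySparkContext] = None
}
{code}

With this sketch, creating a second {{MySparkContext}} before calling {{stop()}} on the first throws an {{IllegalStateException}} instead of leaving two contexts in an unspecified state.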


