Github user andrewor14 commented on a diff in the pull request:
https://github.com/apache/spark/pull/3121#discussion_r19987234
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -179,6 +182,30 @@ class SparkContext(config: SparkConf) extends SparkStatusAPI with Logging {
   conf.setIfMissing("spark.driver.host", Utils.localHostName())
   conf.setIfMissing("spark.driver.port", "0")

+  // This is placed after the configuration validation so that common configuration errors, like
+  // forgetting to pass a master url or app name, don't prevent subsequent SparkContexts from being
+  // constructed.
+  SparkContext.SPARK_CONTEXT_CONSTRUCTOR_LOCK.synchronized {
+    SparkContext.activeSparkContextCreationSite.foreach { creationSite =>
+      val errMsg = "Only one SparkContext may be active in this JVM (see SPARK-2243)."
+      val errDetails = if (SparkContext.activeSparkContextIsFullyConstructed) {
+        s"The currently active SparkContext was created at:\n${creationSite.longForm}"
+      } else {
+        "Another SparkContext is either being constructed or threw an exception from its" +
+          " constructor; please restart your JVM in order to create a new SparkContext." +
+          s" The current SparkContext was created at:\n${creationSite.longForm}"
+      }
+      val exception = new SparkException(s"$errMsg $errDetails")
+      if (conf.getBoolean("spark.driver.disableMultipleSparkContextsErrorChecking", false)) {
--- End diff --
Here I'd prefer something more concise: `spark.driver.disallowMultipleContexts`
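
To illustrate the guard being discussed, here is a minimal, hypothetical sketch of the single-active-context pattern with an opt-out flag like the one being named above. `ContextGuard`, `register`, and `allowMultiple` are illustrative names, not Spark APIs; the real patch synchronizes on a constructor lock and tracks the creation call site:

```scala
import java.util.concurrent.atomic.AtomicReference

// Hypothetical sketch: allow only one "active" context per JVM, with an
// escape hatch analogous to the proposed config key.
object ContextGuard {
  // Holds the creation site of the currently active context, or null if none.
  private val active = new AtomicReference[String](null)

  // Atomically claims the active slot. If another context is already active:
  //  - with allowMultiple = true, return a warning message instead of failing
  //  - with allowMultiple = false, fail fast with the creation site of the
  //    existing context, so the user can find the offending call
  def register(creationSite: String, allowMultiple: Boolean): Option[String] = {
    if (active.compareAndSet(null, creationSite)) {
      None // we are now the single active context
    } else {
      val msg = s"Only one context may be active in this JVM; " +
        s"the existing one was created at: ${active.get}"
      if (allowMultiple) Some(msg)
      else throw new IllegalStateException(msg)
    }
  }
}
```

The `AtomicReference` stands in for the lock-plus-flag bookkeeping in the diff; the key design point either way is that the error message records *where* the first context was created, which is what makes the failure debuggable.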