ueshin commented on a change in pull request #28986:
URL: https://github.com/apache/spark/pull/28986#discussion_r449319134



##########
File path: core/src/main/scala/org/apache/spark/SparkContext.scala
##########
@@ -2554,6 +2557,19 @@ object SparkContext extends Logging {
     }
   }
 
+  /**
+   * Called to ensure that SparkContext is created or accessed only on the Driver.
+   *
+   * Throws an exception if a SparkContext is about to be created in executors.
+   */
+  private[spark] def assertOnDriver(): Unit = {
+    if (TaskContext.get != null) {

Review comment:
       Under local mode:
   
   ```
    scala> sc.range(0, 1).foreach { _ => new SparkContext(new SparkConf().setAppName("test").setMaster("local")) }
    java.lang.IllegalStateException: SparkContext should only be created and accessed on the driver.
   ...
   ```
   
   before this patch:
   
   ```
    scala> sc.range(0, 1).foreach { _ => new SparkContext(new SparkConf().setAppName("test").setMaster("local")) }
    org.apache.spark.SparkException: Only one SparkContext should be running in this JVM (see SPARK-2243).The currently running SparkContext was created at:
   org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
   ...
   ```
   
    Although the exception type is different, the call fails either way.
    
    I think the new error message explains the actual problem more clearly.
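To make the pattern under discussion concrete, here is a minimal, self-contained sketch (these are invented names, not Spark's real classes) of how a driver-only assertion based on a task context works. In Spark, `TaskContext.get` returns a non-null context only on a thread that is running a task, so checking it distinguishes driver code from executor/task code even in local mode, where both share one JVM:

```scala
// Hypothetical stand-in for Spark's per-task context: a thread-local
// value that is set only while a task runs on the current thread.
object TaskContextSketch {
  private val holder = new ThreadLocal[Option[String]] {
    override def initialValue(): Option[String] = None
  }
  def get: Option[String] = holder.get()
  def setForTask(taskId: String): Unit = holder.set(Some(taskId))
  def unset(): Unit = holder.remove()
}

object DriverAssertSketch {
  // Analogous to the patch's assertOnDriver(): fail fast when called
  // from inside a running task. In local mode the driver and the
  // "executor" share a JVM, but only task threads carry a task context,
  // so the check still discriminates correctly.
  def assertOnDriver(): Unit = {
    if (TaskContextSketch.get.isDefined) {
      throw new IllegalStateException(
        "SparkContext should only be created and accessed on the driver.")
    }
  }
}
```

On the driver thread (no task context set) `assertOnDriver()` is a no-op; once a task context is present it throws, mirroring the `IllegalStateException` in the first snippet above.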



