HyukjinKwon commented on a change in pull request #28986:
URL: https://github.com/apache/spark/pull/28986#discussion_r462673731



##########
File path: core/src/main/scala/org/apache/spark/SparkContext.scala
##########
@@ -2554,6 +2557,19 @@ object SparkContext extends Logging {
     }
   }
 
+  /**
+   * Called to ensure that SparkContext is created or accessed only on the Driver.
+   *
+   * Throws an exception if a SparkContext is about to be created in executors.
+   */
+  private def assertOnDriver(): Unit = {
+    if (TaskContext.get != null) {

Review comment:
       I think it's a bit odd to have two separate configurations for Scala and Python. I don't think we would ever allow this in PySpark specifically in the future, even if it worked. I think we will discourage this behaviour anyway, whether it is in Scala or Python, and eventually deprecate/remove both configurations together. In which case would we disable it on the Scala side but enable it on the Python side?
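
       (For context, a minimal sketch of what the completed check might look like; the diff above is truncated at the `if`, so the throw branch and its message here are assumptions rather than the merged code. This would live inside `object SparkContext`.)

           private def assertOnDriver(): Unit = {
             if (TaskContext.get != null) {
               // TaskContext.get is non-null only inside a running task,
               // i.e. when this code executes on an executor.
               throw new IllegalStateException(
                 "SparkContext should only be created and accessed on the driver.")
             }
           }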

##########
File path: core/src/main/scala/org/apache/spark/SparkContext.scala
##########
@@ -2554,6 +2557,19 @@ object SparkContext extends Logging {
     }
   }
 
+  /**
+   * Called to ensure that SparkContext is created or accessed only on the Driver.
+   *
+   * Throws an exception if a SparkContext is about to be created in executors.
+   */
+  private def assertOnDriver(): Unit = {
+    if (TaskContext.get != null) {

Review comment:
       I see, but I guess we should discourage/deprecate/remove this behaviour in the end - am I correct? It's a bit weird (to end users) to have two configurations controlling the same behaviour. And it seems unlikely that it would need to be disabled in Scala but enabled in Python specifically. We could just enable it on both sides.
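
       (To illustrate the suggestion, a hypothetical sketch of a single flag consulted by both sides instead of two language-specific configurations; the key name "spark.executor.allowSparkContext" and the method signature are illustrative assumptions, not necessarily what was merged.)

           import org.apache.spark.{SparkConf, TaskContext}

           // Hypothetical: one escape hatch read by both the Scala and Python
           // code paths, rather than two separate configurations.
           private def assertOnDriver(conf: SparkConf): Unit = {
             val allowed = conf.getBoolean("spark.executor.allowSparkContext", false)
             if (TaskContext.get != null && !allowed) {
               throw new IllegalStateException(
                 "SparkContext should only be created and accessed on the driver.")
             }
           }

       PySpark could then read the same key through the JVM-side conf, so the two languages stay in sync by construction.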



