zsxwing commented on a change in pull request #28986:
URL: https://github.com/apache/spark/pull/28986#discussion_r462679180
##########
File path: core/src/main/scala/org/apache/spark/SparkContext.scala
##########
@@ -2554,6 +2557,19 @@ object SparkContext extends Logging {
}
}
+  /**
+   * Called to ensure that SparkContext is created or accessed only on the Driver.
+   *
+   * Throws an exception if a SparkContext is about to be created in executors.
+   */
+  private def assertOnDriver(): Unit = {
+    if (TaskContext.get != null) {
Review comment:
> In which case would we disable it in Scala side but enable it in Python side?

Like how MLflow uses it today. It's pretty hard to make this work correctly on the Scala side, but it already works in Python today.
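
To illustrate the trade-off being discussed, here is a minimal sketch of how the assertion could be gated by a configuration flag, so that one side (e.g. PySpark, for MLflow-style use cases) can keep the check enabled while the other opts out. The config key `spark.executor.allowSparkContext`, the `conf` parameter, and the wrapping object are illustrative assumptions, not necessarily what this PR ends up with:

```scala
import org.apache.spark.{SparkConf, SparkException, TaskContext}

object DriverOnlyCheck {
  // Sketch: gate the driver-only assertion behind a config flag so that,
  // for example, the Python side can enforce it while Scala callers opt out.
  // The config key below is an assumption used for illustration only.
  def assertOnDriver(conf: SparkConf): Unit = {
    val allowOnExecutors = conf.getBoolean("spark.executor.allowSparkContext", false)
    if (!allowOnExecutors && TaskContext.get != null) {
      // TaskContext.get returns non-null only inside a task running on an executor.
      throw new SparkException(
        "SparkContext should only be created and accessed on the driver.")
    }
  }
}
```

A flag along these lines would keep the check on by default while leaving an escape hatch for environments that intentionally touch SparkContext from executor-side code.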