srowen commented on a change in pull request #28986:
URL: https://github.com/apache/spark/pull/28986#discussion_r461175411
##########
File path: core/src/main/scala/org/apache/spark/SparkContext.scala
##########
@@ -2554,6 +2557,19 @@ object SparkContext extends Logging {
}
}
+ /**
+ * Called to ensure that SparkContext is created or accessed only on the Driver.
+ *
+ * Throws an exception if a SparkContext is about to be created in executors.
+ */
+ private def assertOnDriver(): Unit = {
+ if (TaskContext.get != null) {
Review comment:
Yep, if that's the logic - code that previously would have harmlessly
created a SparkContext that does not work, and now fails explicitly - then
I'd say just revert it. It's only trying to fail fast, but maybe that's not a
good idea. If it affects MLflow, I'd guess it affects N other applications.
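
For context, the check in the diff relies on `TaskContext.get` returning a non-null `TaskContext` only from inside a task running on an executor; on the driver it returns null. A minimal sketch of what the complete method in the PR likely looks like follows - the exception type and message here are assumptions, not the PR's exact wording:

```scala
import org.apache.spark.TaskContext

/**
 * Sketch based on the diff excerpt above: fail fast if SparkContext is
 * created or accessed from an executor rather than the driver.
 */
private def assertOnDriver(): Unit = {
  if (TaskContext.get != null) {
    // TaskContext.get is non-null only inside a running task on an executor,
    // so reaching this branch means we are not on the driver.
    throw new IllegalStateException(
      "SparkContext should only be created and accessed on the driver.")
  }
}
```

The trade-off under discussion is exactly this fail-fast behavior: applications that previously created a (non-functional) SparkContext on an executor without error would now throw.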
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]