ueshin opened a new pull request #29278:
URL: https://github.com/apache/spark/pull/29278


   ### What changes were proposed in this pull request?
   
   This is a follow-up of #28986.
   This PR adds configs to allow or disallow creating a `SparkContext` in 
executors.
   
   - `spark.driver.allowSparkContextInExecutors` for Scala/Java
   - `spark.python.allowSparkContextInExecutors` for Python
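
   As a sketch of how these configs might be set (assuming the default remains disallowed, they could be supplied in `spark-defaults.conf` or via `--conf` on `spark-submit`):

   ```
   # spark-defaults.conf (hypothetical usage; names taken from this PR)
   spark.driver.allowSparkContextInExecutors  true
   spark.python.allowSparkContextInExecutors  true
   ```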
   
   ### Why are the changes needed?
   
   Some users or libraries actually create `SparkContext` in executors.
   We shouldn't break their workloads.
   
   ### Does this PR introduce _any_ user-facing change?
   
   Yes, with the configs enabled, users will be able to create a 
`SparkContext` in executors.
   
   ### How was this patch tested?
   
   New tests were added.
   

