attilapiros opened a new pull request, #39775:
URL: https://github.com/apache/spark/pull/39775

   ### What changes were proposed in this pull request?
   
   Introducing a config that controls whether all active SparkContexts are closed after the `main` method has finished.
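
   The PR text does not name the config key, so the sketch below uses a hypothetical key (`spark.example.stopActiveContextsAfterMain`) purely for illustration; the real key is defined in the PR diff:
   
   ```shell
   # Hypothetical config key, shown only to illustrate the intended usage.
   # The default (false) preserves the Spark 3.1 behaviour: active
   # SparkContexts are NOT stopped when main() returns, so a long-running
   # job server that submits work outside main() keeps functioning.
   spark-submit \
     --conf spark.example.stopActiveContextsAfterMain=false \
     --class com.example.JobServerMain \
     job-server.jar
   ```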
   
   ### Why are the changes needed?
   
   We ran into errors after upgrading from Spark 3.1 to Spark 3.2, because the 
SparkContext got closed right after the application started. It turned 
out the root cause was 
[SPARK-34674](https://issues.apache.org/jira/browse/SPARK-34674), which 
introduced closing the SparkContexts after the `main` method has finished. 
   
   The application was a Spark job server built on top of Spring Boot, so all 
job submissions happened outside the `main` method.
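
   The failure mode can be sketched as follows. This is a minimal illustration, not code from the PR; all class and object names are hypothetical:
   
   ```scala
   import org.apache.spark.{SparkConf, SparkContext}
   import org.springframework.boot.SpringApplication
   import org.springframework.boot.autoconfigure.SpringBootApplication
   
   @SpringBootApplication
   class JobServer
   
   object JobServer {
     def main(args: Array[String]): Unit = {
       // Create the SparkContext that HTTP request handlers will reuse later.
       val sc = new SparkContext(new SparkConf().setAppName("job-server"))
       ContextHolder.sc = sc
       SpringApplication.run(classOf[JobServer], args: _*)
       // main() returns here, but the embedded web server keeps running on
       // non-daemon threads. Since SPARK-34674, Spark stops all active
       // SparkContexts at this point, so jobs submitted afterwards through
       // the HTTP endpoints fail.
     }
   }
   
   // Trivial holder so request handlers can reach the shared context.
   object ContextHolder {
     @volatile var sc: SparkContext = _
   }
   ```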
   
   ### Does this PR introduce _any_ user-facing change?
   
   With the current default (false), the behaviour is the same as it was on 
Spark 3.1. 
   
   ### How was this patch tested?
   
   Manually.
   

