This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch branch-3.1
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.1 by this push:
new cc62c49 [SPARK-34674][CORE][K8S] Close SparkContext after the Main method has finished
cc62c49 is described below
commit cc62c498276e61a3b54544bf21b14d91883a5531
Author: skotlov <[email protected]>
AuthorDate: Wed Apr 21 22:54:16 2021 -0700
[SPARK-34674][CORE][K8S] Close SparkContext after the Main method has finished
### What changes were proposed in this pull request?
Close SparkContext after the Main method has finished, to allow a SparkApplication on K8S to complete.
This is a fixed version of the [merged and reverted PR](https://github.com/apache/spark/pull/32081).
### Why are the changes needed?
If I don't call sparkContext.stop() explicitly, the Spark driver process does not
terminate even after its Main method has completed. This behaviour differs from Spark
on YARN, where stopping the SparkContext manually is not required. The problem appears
to be the use of non-daemon threads, which prevent the driver JVM process from terminating.
So I have inserted code that closes the SparkContext automatically.
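For illustration only (not part of this patch), here is a minimal sketch of the kind of batch job affected; the class name and logic are hypothetical. Before this change, omitting the explicit `spark.stop()` left non-daemon threads running and the K8S driver pod never completed; with this change, SparkSubmit stops the active SparkContext once `main()` returns, so the explicit call becomes optional.

```scala
// Hypothetical example job, shown only to illustrate the behaviour described above.
import org.apache.spark.sql.SparkSession

object ExampleBatchJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("example-batch-job").getOrCreate()

    // Trivial work standing in for the real application logic.
    val rowCount = spark.range(0, 1000).count()
    println(s"Processed $rowCount rows")

    // Previously required on K8S for the driver pod to complete; after this
    // change the active SparkContext is stopped automatically when main() returns.
    spark.stop()
  }
}
```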
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
Manually, on my company's production AWS EKS environment.
Closes #32283 from kotlovs/close-spark-context-on-exit-2.
Authored-by: skotlov <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
(cherry picked from commit b17a0e6931cac98cc839c047b1b5d4ea6d052009)
Signed-off-by: Dongjoon Hyun <[email protected]>
---
core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala | 9 +++++++++
1 file changed, 9 insertions(+)
diff --git a/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
index bb3a20d..5f89dca 100644
--- a/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
@@ -952,6 +952,15 @@ private[spark] class SparkSubmit extends Logging {
     } catch {
       case t: Throwable =>
         throw findCause(t)
+    } finally {
+      if (!isShell(args.primaryResource) && !isSqlShell(args.mainClass) &&
+          !isThriftServer(args.mainClass)) {
+        try {
+          SparkContext.getActive.foreach(_.stop())
+        } catch {
+          case e: Throwable => logError(s"Failed to close SparkContext: $e")
+        }
+      }
     }
   }
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]