This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/spark-kubernetes-operator.git


The following commit(s) were added to refs/heads/main by this push:
     new de68dd6  [SPARK-55537] Check `spark.dynamicAllocation.enabled` before overriding deleteOnTermination
de68dd6 is described below

commit de68dd64c7e8c9a3f0c1dd2ad037addcaeca2931
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Sun Feb 15 22:19:46 2026 -0800

    [SPARK-55537] Check `spark.dynamicAllocation.enabled` before overriding deleteOnTermination
    
    ### What changes were proposed in this pull request?
    
    This PR aims to check `spark.dynamicAllocation.enabled` before overriding deleteOnTermination.
    
    ### Why are the changes needed?
    
    When `Dynamic Allocation` is enabled, the operator should respect Spark's default executor clean-up behavior instead of overriding it.
    - #484
    
    ### Does this PR introduce _any_ user-facing change?
    
    Yes, but SPARK-55352 is not released yet.
    
    ### How was this patch tested?
    
    Pass the CIs.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    Generated-by: `Gemini 3 Pro (High)` on `Antigravity`
    
    Closes #501 from dongjoon-hyun/SPARK-55537.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 .../java/org/apache/spark/k8s/operator/SparkAppSubmissionWorker.java | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/spark-submission-worker/src/main/java/org/apache/spark/k8s/operator/SparkAppSubmissionWorker.java b/spark-submission-worker/src/main/java/org/apache/spark/k8s/operator/SparkAppSubmissionWorker.java
index 6f38f90..8250e1b 100644
--- a/spark-submission-worker/src/main/java/org/apache/spark/k8s/operator/SparkAppSubmissionWorker.java
+++ b/spark-submission-worker/src/main/java/org/apache/spark/k8s/operator/SparkAppSubmissionWorker.java
@@ -168,8 +168,9 @@ public class SparkAppSubmissionWorker {
     effectiveSparkConf.setIfMissing("spark.app.id", appId);
     effectiveSparkConf.setIfMissing("spark.authenticate", "true");
     effectiveSparkConf.setIfMissing("spark.io.encryption.enabled", "true");
-    // Use K8s Garbage Collection instead of explicit API invocations
-    if (applicationSpec.getApplicationTolerations().getResourceRetainPolicy() !=
+    // In case of static allocation, use K8s Garbage Collection instead of explicit API invocations
+    if (!"true".equalsIgnoreCase(effectiveSparkConf.get("spark.dynamicAllocation.enabled", "false"))
+        && applicationSpec.getApplicationTolerations().getResourceRetainPolicy() !=
         ResourceRetainPolicy.Always) {
       effectiveSparkConf.setIfMissing("spark.kubernetes.executor.deleteOnTermination", "false");
     }
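
The guard added by this patch can be illustrated in isolation. The sketch below is a simplified stand-in, not the operator's actual code: it uses a plain `Map` in place of `SparkConf`, the helper `shouldUseK8sGc` is hypothetical, and the enum constants other than `Always` are assumed for illustration.

```java
import java.util.HashMap;
import java.util.Map;

public class DeleteOnTerminationGuard {
  // `Always` appears in the patch; the other constant is assumed for this sketch.
  enum ResourceRetainPolicy { Always, Never }

  // Mirrors the patched condition: only fall back to K8s garbage collection
  // (deleteOnTermination=false) when dynamic allocation is disabled AND the
  // retain policy is not Always.
  static boolean shouldUseK8sGc(Map<String, String> conf, ResourceRetainPolicy policy) {
    String dynamicAllocation = conf.getOrDefault("spark.dynamicAllocation.enabled", "false");
    return !"true".equalsIgnoreCase(dynamicAllocation) && policy != ResourceRetainPolicy.Always;
  }

  public static void main(String[] args) {
    Map<String, String> conf = new HashMap<>();
    // Static allocation (default): the operator may override deleteOnTermination.
    System.out.println(shouldUseK8sGc(conf, ResourceRetainPolicy.Never));  // true
    // Dynamic allocation: respect Spark's default executor clean-up behavior.
    conf.put("spark.dynamicAllocation.enabled", "true");
    System.out.println(shouldUseK8sGc(conf, ResourceRetainPolicy.Never));  // false
  }
}
```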


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
