This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/spark-kubernetes-operator.git


The following commit(s) were added to refs/heads/main by this push:
     new 487da1e  [SPARK-55095] Enable `spark.io.encryption.enabled` by default
487da1e is described below

commit 487da1e5e20e53173259fe30c3aa15a1f576d32e
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Tue Jan 20 17:11:56 2026 +0900

    [SPARK-55095] Enable `spark.io.encryption.enabled` by default
    
    ### What changes were proposed in this pull request?
    
    This PR aims to enable local disk I/O encryption by default via `spark.io.encryption.enabled=true`.
    
    ### Why are the changes needed?
    
    To improve Apache Spark application security by default.
    
    ### Does this PR introduce _any_ user-facing change?
    
    Spark will encrypt temporary data written to local disks. This covers shuffle files, shuffle spills, and data blocks stored on disk (for both caching and broadcast variables).
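    The default is applied with `setIfMissing`, so an application that explicitly configures `spark.io.encryption.enabled` keeps its own value; only unset configurations pick up the new default. A minimal sketch of that precedence, using a plain `Map` as a simplified stand-in for `SparkConf` (the class and helper names here are illustrative, not part of the operator):
    
    ```java
    import java.util.HashMap;
    import java.util.Map;
    
    public class SetIfMissingDemo {
        // Simplified stand-in for SparkConf.setIfMissing: apply the default
        // only when the user has not already set the key.
        static void setIfMissing(Map<String, String> conf, String key, String value) {
            conf.putIfAbsent(key, value);
        }
    
        public static void main(String[] args) {
            // Case 1: the user set nothing, so the operator default wins.
            Map<String, String> defaults = new HashMap<>();
            setIfMissing(defaults, "spark.io.encryption.enabled", "true");
            System.out.println(defaults.get("spark.io.encryption.enabled")); // true
    
            // Case 2: the user explicitly opted out; their value is preserved.
            Map<String, String> userConf = new HashMap<>();
            userConf.put("spark.io.encryption.enabled", "false");
            setIfMissing(userConf, "spark.io.encryption.enabled", "true");
            System.out.println(userConf.get("spark.io.encryption.enabled")); // false
        }
    }
    ```
    
    In other words, this change hardens the default posture without removing the ability to disable encryption per application.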
    
    ### How was this patch tested?
    
    Manually.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #467 from dongjoon-hyun/SPARK-55095.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 .../java/org/apache/spark/k8s/operator/SparkAppSubmissionWorker.java     | 1 +
 1 file changed, 1 insertion(+)

diff --git a/spark-submission-worker/src/main/java/org/apache/spark/k8s/operator/SparkAppSubmissionWorker.java b/spark-submission-worker/src/main/java/org/apache/spark/k8s/operator/SparkAppSubmissionWorker.java
index 91f4289..7b13e92 100644
--- a/spark-submission-worker/src/main/java/org/apache/spark/k8s/operator/SparkAppSubmissionWorker.java
+++ b/spark-submission-worker/src/main/java/org/apache/spark/k8s/operator/SparkAppSubmissionWorker.java
@@ -166,6 +166,7 @@ public class SparkAppSubmissionWorker {
     String appId = generateSparkAppId(app);
     effectiveSparkConf.setIfMissing("spark.app.id", appId);
     effectiveSparkConf.setIfMissing("spark.authenticate", "true");
+    effectiveSparkConf.setIfMissing("spark.io.encryption.enabled", "true");
     return SparkAppDriverConf.create(
         effectiveSparkConf,
         sparkVersion,


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
