This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-4.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-4.0 by this push:
     new a35d9f3ad98d [SPARK-51956][K8S] Fix `KerberosConfDriverFeatureStep` to warn in case of failures
a35d9f3ad98d is described below

commit a35d9f3ad98d4b0b9b31e43cec51b7f7cf5ddff0
Author: Dongjoon Hyun <dongj...@apache.org>
AuthorDate: Tue Apr 29 18:01:45 2025 -0700

    [SPARK-51956][K8S] Fix `KerberosConfDriverFeatureStep` to warn in case of failures
    
    ### What changes were proposed in this pull request?
    
    This PR aims to fix `KerberosConfDriverFeatureStep` to warn in case of failures and continue.
    
    ### Why are the changes needed?
    
    The `DelegationTokenProvider.obtainDelegationTokens` implementations are designed to warn in case of failures:
    
    
https://github.com/apache/spark/blob/54eb1a2f863bd7d8706c5c9a568895adb026c78d/sql/hive/src/main/scala/org/apache/spark/sql/hive/security/HiveDelegationTokenProvider.scala#L115-L121
    
    
https://github.com/apache/spark/blob/54eb1a2f863bd7d8706c5c9a568895adb026c78d/core/src/main/scala/org/apache/spark/deploy/security/HBaseDelegationTokenProvider.scala#L100-L101
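
    The linked providers follow roughly the warn-and-continue pattern sketched below. This is a simplified illustration, not the actual Spark provider interface: the `ExampleTokenProvider` name and the reduced method signature are made up for this sketch.

    ```scala
    import scala.util.control.NonFatal

    import org.apache.hadoop.security.Credentials

    import org.apache.spark.internal.Logging

    object ExampleTokenProvider extends Logging {
      // Try to add service-specific tokens to `creds`; on any non-fatal error,
      // log a warning and return None (no renewal time) so that submission
      // can continue without tokens for this service.
      def obtainDelegationTokens(creds: Credentials): Option[Long] = {
        try {
          // ... service-specific token acquisition would add tokens to `creds` ...
          None
        } catch {
          case NonFatal(e) =>
            logWarning("Failed to get token from the service", e)
            None
        }
      }
    }
    ```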
    
    `KerberosConfDriverFeatureStep` should follow the same warn-and-continue behavior when getting credentials and obtaining delegation tokens, instead of failing at job submission.
    
    
https://github.com/apache/spark/blob/54eb1a2f863bd7d8706c5c9a568895adb026c78d/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/KerberosConfDriverFeatureStep.scala#L94-L95
    
    ```
    Failed to request driver from scheduler backend. StackTrace:  ...
    at ....KerberosConfDriverFeatureStep.delegationTokens$lzycompute (KerberosConfDriverFeatureStep.scala:94)
    at ....KerberosConfDriverFeatureStep$$delegationTokens (KerberosConfDriverFeatureStep.scala:90)
    ```
    
    ### Does this PR introduce _any_ user-facing change?
    
    Previously, `KerberosConfDriverFeatureStep` failed if an exception occurred. Now, it warns and continues to the next steps.
    
    ### How was this patch tested?
    
    It's a little difficult to write a test case.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #50758 from dongjoon-hyun/SPARK-51956.
    
    Authored-by: Dongjoon Hyun <dongj...@apache.org>
    Signed-off-by: Dongjoon Hyun <dongj...@apache.org>
    (cherry picked from commit 4445cd8c0c265b65209b2df9d637057aa40a7b20)
    Signed-off-by: Dongjoon Hyun <dongj...@apache.org>
---
 .../features/KerberosConfDriverFeatureStep.scala   | 23 ++++++++++++++--------
 1 file changed, 15 insertions(+), 8 deletions(-)

diff --git a/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/KerberosConfDriverFeatureStep.scala b/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/KerberosConfDriverFeatureStep.scala
index 89aefe47e46d..bd591b39de01 100644
--- a/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/KerberosConfDriverFeatureStep.scala
+++ b/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/KerberosConfDriverFeatureStep.scala
@@ -20,6 +20,7 @@ import java.io.File
 import java.nio.charset.StandardCharsets
 
 import scala.jdk.CollectionConverters._
+import scala.util.control.NonFatal
 
 import com.google.common.io.Files
 import io.fabric8.kubernetes.api.model._
@@ -91,14 +92,20 @@ private[spark] class KerberosConfDriverFeatureStep(kubernetesConf: KubernetesDri
     if (keytab.isEmpty && existingSecretName.isEmpty) {
       val tokenManager = new HadoopDelegationTokenManager(kubernetesConf.sparkConf,
         SparkHadoopUtil.get.newConfiguration(kubernetesConf.sparkConf), null)
-      val creds = UserGroupInformation.getCurrentUser().getCredentials()
-      tokenManager.obtainDelegationTokens(creds)
-      // If no tokens and no secrets are stored in the credentials, make sure nothing is returned,
-      // to avoid creating an unnecessary secret.
-      if (creds.numberOfTokens() > 0 || creds.numberOfSecretKeys() > 0) {
-        SparkHadoopUtil.get.serialize(creds)
-      } else {
-        null
+      try {
+        val creds = UserGroupInformation.getCurrentUser().getCredentials()
+        tokenManager.obtainDelegationTokens(creds)
+        // If no tokens and no secrets are stored in the credentials, make sure nothing is returned,
+        // to avoid creating an unnecessary secret.
+        if (creds.numberOfTokens() > 0 || creds.numberOfSecretKeys() > 0) {
+          SparkHadoopUtil.get.serialize(creds)
+        } else {
+          null
+        }
+      } catch {
+        case NonFatal(e) =>
+          logWarning("Fail to get credentials", e)
+          null
       }
     } else {
       null


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
