Github user rvesse commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21669#discussion_r204403925
  
    --- Diff: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/submit/KubernetesClientApplication.scala ---
    @@ -107,7 +109,14 @@ private[spark] class Client(
       def run(): Unit = {
         val resolvedDriverSpec = builder.buildFromFeatures(kubernetesConf)
         val configMapName = s"$kubernetesResourceNamePrefix-driver-conf-map"
    -    val configMap = buildConfigMap(configMapName, resolvedDriverSpec.systemProperties)
    +    val isKerberosEnabled = kubernetesConf.getTokenManager.isSecurityEnabled
    +    // HADOOP_SECURITY_AUTHENTICATION is defined as simple for the driver and executors as
    +    // they need only the delegation token to access secure HDFS, no need to sign in to Kerberos
    +    val maybeSimpleAuthentication =
    +      if (isKerberosEnabled) Some((s"-D$HADOOP_SECURITY_AUTHENTICATION", "simple")) else None
    --- End diff --
    
    I am unsure why this is needed. When delegation tokens are available in the containers, the Hadoop UGI machinery should use them in preference to a Kerberos login anyway. And if you want to run applications that genuinely need to perform a Kerberos login, e.g. the Spark Thrift Server, then this prevents that.
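
    Purely as an illustrative sketch of that point (not code from this PR; the object name and the println are mine): a process in a container whose environment points HADOOP_TOKEN_FILE_LOCATION at a credentials file gets those delegation tokens loaded into its UGI on first use, with no Kerberos sign-in at all, whereas a keytab-based login only works when hadoop.security.authentication is "kerberos".

        import scala.collection.JavaConverters._

        import org.apache.hadoop.conf.Configuration
        import org.apache.hadoop.security.UserGroupInformation

        object TokenPickupSketch {
          def main(args: Array[String]): Unit = {
            // UGI reads HADOOP_TOKEN_FILE_LOCATION from the environment when the
            // login user is first resolved, so any delegation tokens the launcher
            // wrote there are picked up without signing in to Kerberos.
            UserGroupInformation.setConfiguration(new Configuration())
            val ugi = UserGroupInformation.getCurrentUser

            ugi.getCredentials.getAllTokens.asScala.foreach { token =>
              println(s"Loaded delegation token for service ${token.getService}")
            }

            // An application that genuinely needs its own Kerberos login
            // (e.g. the Spark Thrift Server) would instead call something like:
            //   UserGroupInformation.loginUserFromKeytab(principal, keytabPath)
            // which requires hadoop.security.authentication to be "kerberos",
            // hence the concern about hard-coding it to "simple".
          }
        }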


---
