Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21669#discussion_r223861857
  
    --- Diff: docs/security.md ---
    @@ -722,7 +722,84 @@ with encryption, at least.
     The Kerberos login will be periodically renewed using the provided 
credentials, and new delegation
     tokens for supported services will be created.
     
    +## Secure Interaction with Kubernetes
    +
    +When talking to Hadoop-based services secured with Kerberos, Spark needs to obtain delegation tokens
    +so that non-local processes can authenticate. On Kubernetes, these delegation tokens are stored in Secrets that are
    +shared by the Driver and its Executors. As such, there are three ways of submitting a Kerberos job:
    +
    +In all cases you must define either the environment variable `HADOOP_CONF_DIR` or the property
    +`spark.kubernetes.hadoop.configMapName`, as well as either
    +`spark.kubernetes.kerberos.krb5.location` or `spark.kubernetes.kerberos.krb5.configMapName`.
    +
    +It is also important to note that the KDC needs to be visible from inside the containers if the user uses a local
    --- End diff --
    
    Not just "if the user uses a local krb5 file".
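
    For reference, a minimal, hypothetical sketch of a submission that sets the properties mentioned in the diff. The master URL, container image, jar path and HDFS path are placeholders (not values from this PR), and the krb5 property name is the one proposed in this version of the docs:

    ```bash
    # Hypothetical kerberized submission on Kubernetes; assumes the user already
    # has a valid TGT (e.g. via kinit) on the submitting machine.
    # HADOOP_CONF_DIR could instead be replaced by
    # --conf spark.kubernetes.hadoop.configMapName=<hadoop-config-map>.
    export HADOOP_CONF_DIR=/etc/hadoop/conf

    bin/spark-submit \
        --deploy-mode cluster \
        --class org.apache.spark.examples.HdfsTest \
        --master k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port> \
        --conf spark.kubernetes.container.image=<spark-image> \
        --conf spark.kubernetes.kerberos.krb5.location=/etc/krb5.conf \
        local:///opt/spark/examples/jars/spark-examples.jar \
        hdfs://<namenode>/<path-to-test-file>
    ```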


---

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]

Reply via email to