[ https://issues.apache.org/jira/browse/SPARK-25355?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17531628#comment-17531628 ]
Shrikant commented on SPARK-25355:
----------------------------------

We have to use the --proxy-user arg, so we need to get Option 1 working. The client log shows that a token was created for the proxy user:

{code:java}
22/05/04 04:13:07 INFO DFSClient: Created token for proxyUser: HDFS_DELEGATION_TOKEN owner=proxyUser, renewer=proxyUser, realUser=<user>/<t...@domain.com>, issueDate=1651637587347, maxDate=1652242387347, sequenceNumber=183545, masterKeyId=606 on ha-hdfs:<hdfs>
{code}

But how will this token be made available on the driver pod?

(IllegalArgumentException: Empty cookie header string) -> This is not supposed to have any impact: https://issues.apache.org/jira/browse/HDFS-15136

> Support --proxy-user for Spark on K8s
> -------------------------------------
>
> Key: SPARK-25355
> URL: https://issues.apache.org/jira/browse/SPARK-25355
> Project: Spark
> Issue Type: Sub-task
> Components: Kubernetes, Spark Core
> Affects Versions: 3.1.0
> Reporter: Stavros Kontopoulos
> Assignee: Pedro Rossi
> Priority: Major
> Fix For: 3.1.0
>
> Attachments: client.log, driver.log
>
> SPARK-23257 adds kerberized HDFS support for Spark on K8s. A major addition still needed is support for a proxy user. A proxy user is impersonated by a superuser, who executes operations on the proxy user's behalf. More on this:
> [https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/Superusers.html]
> [https://github.com/spark-notebook/spark-notebook/blob/master/docs/proxyuser_impersonation.md]
> This has been implemented for Yarn upstream, and for Spark on Mesos here:
> [https://github.com/mesosphere/spark/pull/26]
> [~ifilonenko] creating this issue according to our discussion.

--
This message was sent by Atlassian Jira (v8.20.7#820007)
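For context, a client-side submission with impersonation would look roughly like the sketch below. This is only an illustration, not a command from this ticket: the master URL, container image, principal, and keytab path are placeholder assumptions, and only --proxy-user, --master, --deploy-mode, spark.kubernetes.container.image, spark.kerberos.principal, and spark.kerberos.keytab are real Spark options.

{code:bash}
# Hedged sketch: submit as a kerberized superuser while impersonating
# another user on K8s. All angle-bracketed values and paths below are
# placeholders, not details from this issue.
spark-submit \
  --master k8s://https://<k8s-apiserver>:6443 \
  --deploy-mode cluster \
  --proxy-user proxyUser \
  --conf spark.kubernetes.container.image=<spark-image> \
  --conf spark.kerberos.principal=<superuser>@<REALM> \
  --conf spark.kerberos.keytab=/path/to/superuser.keytab \
  local:///opt/spark/examples/jars/spark-examples.jar
{code}

With this flow, the superuser's credentials are used to obtain an HDFS_DELEGATION_TOKEN whose owner is the proxy user (as in the client log above); the open question in this ticket is how that token is propagated to the driver pod.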