[
https://issues.apache.org/jira/browse/SPARK-27891?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17898051#comment-17898051
]
Yongjun Zhang commented on SPARK-27891:
---------------------------------------
Hi [~vanzin] and other folks, thanks for the discussion here.
I have a question:
https://issues.apache.org/jira/browse/SPARK-27891?focusedCommentId=16852392&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-16852392
My understanding is that passing the keytab is meant to help apps that run longer
than 7 days (so that Spark can obtain a new DT after the 7-day hard expiration),
but for apps running shorter than 7 days, Spark would renew the DT at the 24-hour
interval (the default) without the keytab. Is that right?
Thanks.
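For reference, the two HDFS timers in play are configured in hdfs-site.xml. The values below are the stock Hadoop defaults (in milliseconds), shown only as an illustration of the 24-hour renew interval vs. the 7-day maximum lifetime distinction:

```xml
<!-- hdfs-site.xml: delegation-token timers (stock Hadoop defaults, for illustration) -->
<property>
  <name>dfs.namenode.delegation.token.renew-interval</name>
  <!-- 24 hours: a token renewer can extend the token's validity this often -->
  <value>86400000</value>
</property>
<property>
  <name>dfs.namenode.delegation.token.max-lifetime</name>
  <!-- 7 days: hard cap; past this the token cannot be renewed and a brand-new
       token must be obtained, which requires Kerberos credentials (the keytab) -->
  <value>604800000</value>
</property>
```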
> Long running spark jobs fail because of HDFS delegation token expires
> ---------------------------------------------------------------------
>
> Key: SPARK-27891
> URL: https://issues.apache.org/jira/browse/SPARK-27891
> Project: Spark
> Issue Type: Bug
> Components: Security
> Affects Versions: 2.0.1, 2.1.0, 2.3.1, 2.4.1
> Reporter: hemshankar sahu
> Priority: Critical
> Attachments: application_1559242207407_0001.log,
> spark_2.3.1_failure.log
>
>
> When the spark job runs on a secured cluster for longer than the time
> specified in the dfs.namenode.delegation.token.renew-interval property of
> hdfs-site.xml, the spark job fails.
> Following command was used to submit the spark job
> bin/spark-submit --principal acekrbuser --keytab ~/keytabs/acekrbuser.keytab
> --master yarn --deploy-mode cluster examples/src/main/python/wordcount.py
> /tmp/ff1.txt
>
> Application Logs attached
>
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]