[
https://issues.apache.org/jira/browse/SPARK-7110?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14511007#comment-14511007
]
Thomas Graves commented on SPARK-7110:
--------------------------------------
So with the new APIs, the call to: val job = new NewAPIHadoopJob(hadoopConf)
automatically adds credentials for you. At least it normally does. What version
of Hadoop are you using?
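For reference, a minimal standalone sketch of that mechanism (NewAPIHadoopJob is
just Spark's alias for org.apache.hadoop.mapreduce.Job; on Hadoop 2 the Job
constructor merges the current UGI's credentials into the job; the object name
here is made up for illustration):
{code}
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.mapreduce.Job
import org.apache.hadoop.security.UserGroupInformation
import scala.collection.JavaConverters._

object CredentialsCheck {
  def main(args: Array[String]): Unit = {
    val hadoopConf = new Configuration()

    // Constructing a new-API Job copies the credentials (delegation
    // tokens included) from the current UserGroupInformation into the
    // job; same effect as `new NewAPIHadoopJob(hadoopConf)` in
    // PairRDDFunctions.
    val job = Job.getInstance(hadoopConf)

    // Print what the Job actually picked up. If no HDFS delegation
    // token shows up here, the save has to ask the NameNode for one,
    // and that request fails with "Delegation Token can be issued only
    // with kerberos or web authentication" when the connection is
    // authenticated by a token rather than a kerberos TGT.
    val ugi = UserGroupInformation.getCurrentUser
    println(s"Auth method: ${ugi.getAuthenticationMethod}")
    job.getCredentials.getAllTokens.asScala.foreach { t =>
      println(s"Token kind: ${t.getKind}, service: ${t.getService}")
    }
  }
}
{code}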
So this sometimes works and sometimes doesn't? Is it similar to what is described
in SPARK-1203? I think all I did to reproduce that was run a bunch of stuff in
spark-shell, then wait a while before doing the saveAs. Waiting basically allows
the cached Hadoop FileSystem instances to close, and when you go to reopen them
they don't have the necessary credentials. I think it was only a few minutes.
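Roughly, the spark-shell reproduction looked like this (a sketch from memory of
what's described above; the output path and the five-minute wait are
placeholders, not exact values):
{code}
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat

// Run some work first so FileSystem instances get created and cached.
val rdd = sc.parallelize(1 to 1000).map(i => (i, "value-" + i))
rdd.count()

// Wait long enough for the cached FileSystems to be closed out from
// under us (a few minutes was enough in the SPARK-1203 case).
Thread.sleep(5 * 60 * 1000L)

// The save now reopens HDFS without the delegation token and fails with
// "Delegation Token can be issued only with kerberos or web authentication".
rdd.saveAsNewAPIHadoopFile[TextOutputFormat[Int, String]]("/tmp/spark-7110-repro")
{code}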
> when using saveAsNewAPIHadoopFile, it sometimes throws "Delegation Token can
> be issued only with kerberos or web authentication"
> ------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-7110
> URL: https://issues.apache.org/jira/browse/SPARK-7110
> Project: Spark
> Issue Type: Bug
> Components: YARN
> Affects Versions: 1.1.0
> Reporter: gu-chi
>
> Under yarn-client mode, this issue occurs randomly. The authentication method
> is set to kerberos, and using "saveAsNewAPIHadoopFile" in PairRDDFunctions to
> save data to HDFS throws this exception:
> org.apache.hadoop.ipc.RemoteException(java.io.IOException): Delegation Token
> can be issued only with kerberos or web authentication