[
https://issues.apache.org/jira/browse/HIVE-15767?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16117614#comment-16117614
]
Xuefu Zhang commented on HIVE-15767:
------------------------------------
From what I see, the patch seems logical, harmless at least. What I don't
understand is why Spark would attempt to read this file. As a side note, I
didn't find the source in the Spark code base that does this.
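For context, a minimal sketch (my approximation in the spirit of Hadoop's
{{TokenCache.mergeBinaryTokens}}, not the actual source) of how the path in
{{mapreduce.job.credentials.binary}} ends up in
{{Credentials.readTokenStorageFile}}, the frame in the stack trace quoted below:
{code:java}
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.Credentials;

/**
 * Sketch only: roughly what happens when a job configuration carries
 * mapreduce.job.credentials.binary. The value is treated as a local file
 * path and handed to Credentials.readTokenStorageFile(), which is where the
 * "Exception reading file:..." in the report below is thrown.
 */
public class BinaryTokenSketch {
  static void mergeBinaryTokens(Credentials creds, Configuration conf) throws IOException {
    String binaryTokenFile = conf.get("mapreduce.job.credentials.binary");
    if (binaryTokenFile != null) {
      // Resolve the container-local path and read the token storage file.
      Path tokenPath = FileSystem.getLocal(conf).makeQualified(new Path(binaryTokenFile));
      Credentials binary = Credentials.readTokenStorageFile(tokenPath, conf);
      // Supplement the existing credentials with the tokens from the file.
      creds.mergeAll(binary);
    }
  }
}
{code}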
> Hive On Spark is not working on secure clusters from Oozie
> ----------------------------------------------------------
>
> Key: HIVE-15767
> URL: https://issues.apache.org/jira/browse/HIVE-15767
> Project: Hive
> Issue Type: Bug
> Components: Spark
> Affects Versions: 1.2.1, 2.1.1
> Reporter: Peter Cseh
> Assignee: Peter Cseh
> Attachments: HIVE-15767-001.patch, HIVE-15767-002.patch,
> HIVE-15767.1.patch
>
>
> When a HiveAction is launched from Oozie with Hive On Spark enabled, we're
> getting errors:
> {noformat}
> Caused by: java.io.IOException: Exception reading file:/yarn/nm/usercache/yshi/appcache/application_1485271416004_0022/container_1485271416004_0022_01_000002/container_tokens
> at org.apache.hadoop.security.Credentials.readTokenStorageFile(Credentials.java:188)
> at org.apache.hadoop.mapreduce.security.TokenCache.mergeBinaryTokens(TokenCache.java:155)
> {noformat}
> This is caused by passing the {{mapreduce.job.credentials.binary}} property
> to the Spark configuration in RemoteHiveSparkClient.
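A minimal sketch of the kind of guard the description implies (an assumption on
my part, not the actual HIVE-15767 patch): drop the Oozie-injected
{{mapreduce.job.credentials.binary}} entry when copying the Hive configuration
into the Spark configuration, so Spark never sees the container-local
{{container_tokens}} path:
{code:java}
import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.conf.Configuration;

/**
 * Hypothetical illustration only, not the actual RemoteHiveSparkClient change:
 * filter out mapreduce.job.credentials.binary while flattening a Hadoop/Hive
 * Configuration into the key/value map handed to Spark.
 */
public class SparkConfFilterSketch {
  static Map<String, String> toSparkConf(Configuration hiveConf) {
    Map<String, String> sparkConf = new HashMap<>();
    for (Map.Entry<String, String> entry : hiveConf) {
      if ("mapreduce.job.credentials.binary".equals(entry.getKey())) {
        // Points at the Oozie launcher's container_tokens; not valid elsewhere.
        continue;
      }
      sparkConf.put(entry.getKey(), entry.getValue());
    }
    return sparkConf;
  }
}
{code}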
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)