[ https://issues.apache.org/jira/browse/HCATALOG-80?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13083260#comment-13083260 ]

Sushanth Sowmyan commented on HCATALOG-80:
------------------------------------------

A quick code search reveals that this is not the case, at least for the Hadoop 
version we pull in from Hive.

--
tundra:hcat sush$ find . -name \*java | xargs grep mapreduce.job.credentials.binary
./hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/src/mapred/org/apache/hadoop/mapred/JobClient.java:      conf.get("mapreduce.job.credentials.binary");
./hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/src/mapred/org/apache/hadoop/mapreduce/security/TokenCache.java:            conf.get("mapreduce.job.credentials.binary");
--
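
For reference, a rough paraphrase (my own sketch, not the actual JobClient/TokenCache 
source) of how both of those hits consume the property: it is only read with 
conf.get() and null-checked before use, so a job that never sets it simply takes 
the null branch.

--
import org.apache.hadoop.conf.Configuration;

public class CredentialsBinaryReadSketch {
    // Sketch of the read-only pattern the grep hits above follow
    // (assumed shape, not copied from the Hadoop source).
    static String binaryTokenFile(Configuration conf) {
        String binaryTokenFilename = conf.get("mapreduce.job.credentials.binary");
        if (binaryTokenFilename != null) {
            // secure setups: credentials would be loaded from this file
            return binaryTokenFilename;
        }
        // non-secure mode: property unset, nothing to load
        return null;
    }
}
--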



> NullPointerException when using and running har in non-secure mode
> ------------------------------------------------------------------
>
>                 Key: HCATALOG-80
>                 URL: https://issues.apache.org/jira/browse/HCATALOG-80
>             Project: HCatalog
>          Issue Type: Bug
>            Reporter: Sushanth Sowmyan
>            Assignee: Sushanth Sowmyan
>            Priority: Trivial
>         Attachments: HCATALOG-80.patch
>
>
> Running tests in non-secure mode with har enabled fails because of an NPE 
> caused by trying to set mapreduce.job.credentials.binary in Configuration 
> with null. This is not hit by default because we don't use har in our default 
> configs, and our har testing has primarily been in secure mode.
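
Regarding the NPE described above: Configuration.set() does not accept a null 
value, so the straightforward guard is to skip the set() call when the token 
file location is absent, as it is in non-secure mode. Below is a sketch of that 
kind of guard (the helper name is made up; this is not the attached patch).

--
import org.apache.hadoop.conf.Configuration;

public class CredentialsBinaryNullGuard {
    // Hypothetical helper: copy the binary token file location from one
    // Configuration to another only when it is actually present.
    // Configuration.set(key, null) throws a NullPointerException, which is
    // the failure seen when har is enabled in non-secure mode.
    static void copyTokenFileLocation(Configuration src, Configuration dest) {
        String tokenFile = src.get("mapreduce.job.credentials.binary");
        if (tokenFile != null) {
            dest.set("mapreduce.job.credentials.binary", tokenFile);
        }
        // If tokenFile is null (non-secure mode), do nothing instead of
        // passing null into set() and triggering the NPE.
    }
}
--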
