[
https://issues.apache.org/jira/browse/AMBARI-11312?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14555579#comment-14555579
]
Hadoop QA commented on AMBARI-11312:
------------------------------------
{color:red}-1 overall{color}. Here are the results of testing the latest attachment
http://issues.apache.org/jira/secure/attachment/12734726/ambari-11312-v6.txt
against trunk revision .
{color:green}+1 @author{color}. The patch does not contain any @author tags.
{color:green}+1 tests included{color}. The patch appears to include 5 new or modified test files.
{color:green}+1 javac{color}. The applied patch does not increase the total number of javac compiler warnings.
{color:green}+1 release audit{color}. The applied patch does not increase the total number of release audit warnings.
{color:red}-1 core tests{color}. The patch failed these unit tests in ambari-server:
org.apache.ambari.server.controller.internal.StackArtifactResourceProviderTest
Test results: https://builds.apache.org/job/Ambari-trunk-test-patch/2820//testReport/
Console output: https://builds.apache.org/job/Ambari-trunk-test-patch/2820//console
This message is automatically generated.
> Use default value of hbase.tmp.dir instead of hard coded directory
> ------------------------------------------------------------------
>
> Key: AMBARI-11312
> URL: https://issues.apache.org/jira/browse/AMBARI-11312
> Project: Ambari
> Issue Type: Bug
> Reporter: Ted Yu
> Assignee: Ted Yu
> Attachments: ambari-11312-v1.txt, ambari-11312-v2.txt,
> ambari-11312-v3.txt, ambari-11312-v4.txt, ambari-11312-v5.txt,
> ambari-11312-v6.txt
>
>
> Currently, hbase.tmp.dir is hard-coded to /hadoop/hbase.
> We saw the following error when testing Phoenix UDF functionality:
> {code}
> 15/05/17 14:56:11 WARN util.DynamicClassLoader: Failed to load new jar UDF_artifact-1.0-SNAPSHOT.jar
> java.io.FileNotFoundException: /grid/0/var/log/hbase/local/jars/UDF_artifact-1.0-SNAPSHOT.jar (Permission denied)
>         at java.io.FileOutputStream.open(Native Method)
>         at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
>         at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:222)
>         at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:209)
>         at org.apache.hadoop.fs.RawLocalFileSystem.createOutputStreamWithMode(RawLocalFileSystem.java:305)
>         at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:293)
>         at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:326)
> {code}
> The default value of hbase.tmp.dir should be used instead.
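> For context, stock HBase (hbase-default.xml) resolves hbase.tmp.dir relative to the JVM temp directory and the running user rather than a fixed path. A sketch of that property as HBase itself defines it, shown for illustration only (this is the upstream default, not the Ambari patch):
> {code}
> <property>
>   <name>hbase.tmp.dir</name>
>   <value>${java.io.tmpdir}/hbase-${user.name}</value>
> </property>
> {code}
> Falling back to this default (or a site-specific override) instead of forcing /hadoop/hbase means derived locations such as the local jars directory land under a path the HBase user can write to, avoiding the permission failure above.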
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)