Alejandro Fernandez created AMBARI-7618:
-------------------------------------------

             Summary: Hive load local command fails due to not finding hcatalog-core.jar
                 Key: AMBARI-7618
                 URL: https://issues.apache.org/jira/browse/AMBARI-7618
             Project: Ambari
          Issue Type: Bug
          Components: ambari-server
    Affects Versions: 1.7.0
            Reporter: Alejandro Fernandez
            Assignee: Alejandro Fernandez
             Fix For: 1.7.0


Created a cluster using Ambari with the latest Champlain bits on a CentOS 6.4 VM.

Then attempted to load some sample data:
{code}
# download and extract the sample data
cd /tmp
wget http://seanlahman.com/files/database/lahman591-csv.zip
unzip lahman591-csv.zip
# switch to the hive user (this changes the working directory, so use absolute paths)
su - hive
/usr/hdp/2.2.0.0-806/hadoop/bin/hadoop fs -copyFromLocal /tmp/Schools.csv /tmp
/usr/hdp/2.2.0.0-806/hive/bin/hive
-- in the Hive CLI: create the table, load the CSV, and query it
CREATE TABLE school (id STRING, name STRING, city STRING, state STRING, nick STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\054' STORED AS TEXTFILE;
LOAD DATA LOCAL INPATH '/tmp/Schools.csv' INTO TABLE school;
SELECT name FROM school ORDER BY name ASC LIMIT 10;
{code}


The SELECT statement fails with the following error:
{code}
java.io.FileNotFoundException: File file:/usr/lib/hcatalog/share/hcatalog/hcatalog-core.jar does not exist
        at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:524)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:737)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:514)
        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:409)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:337)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:289)
        at org.apache.hadoop.mapreduce.JobSubmitter.copyRemoteFiles(JobSubmitter.java:140)
        at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:213)
        at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1294)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1291)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1291)
        at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
        at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
        at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:420)
        at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:161)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1603)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1363)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1176)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1003)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:993)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:246)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:198)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:408)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Job Submission failed with exception 'java.io.FileNotFoundException(File file:/usr/lib/hcatalog/share/hcatalog/hcatalog-core.jar does not exist)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
{code}

The Hive property in hive-site.xml for HDP 2.2 that controls where user-defined JARs are loaded from needs to change now that HDP ships versioned RPMs; the jar is no longer present at the old /usr/lib/hcatalog location.
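On an HDP 2.2 host the mismatch can be confirmed with a quick sanity check of both locations (the second path is the one the fix below points at):
{code}
# old unversioned location that the failing job submission still references
ls /usr/lib/hcatalog/share/hcatalog/
# versioned install, reachable through the /usr/hdp/current symlinks
ls /usr/hdp/current/hive-hcatalog/share/hcatalog/
{code}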

This property should be set to the versioned location:
{code}
hive.aux.jars.path=file:///usr/hdp/current/hive-hcatalog/share/hcatalog
{code}
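For reference, the corresponding hive-site.xml entry would look like the sketch below (assuming /usr/hdp/current/hive-hcatalog points at the installed 2.2 stack, e.g. 2.2.0.0-806):
{code}
<property>
  <name>hive.aux.jars.path</name>
  <value>file:///usr/hdp/current/hive-hcatalog/share/hcatalog</value>
</property>
{code}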



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
