-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26285/
-----------------------------------------------------------

Review request for Ambari, Hari Sankar Sivarama Subramaniyan, Sumit Mohanty, 
and Sid Wagle.


Bugs: AMBARI-7618
    https://issues.apache.org/jira/browse/AMBARI-7618


Repository: ambari


Description
-------

Created a cluster using Ambari with the latest Champlain bits on a CentOS 6.4 
VM, then attempted to load some sample data:

cd /tmp
wget http://seanlahman.com/files/database/lahman591-csv.zip
unzip lahman591-csv.zip
su - hive
/usr/hdp/2.2.0.0-806/hadoop/bin/hadoop fs -copyFromLocal Schools.csv /tmp
/usr/hdp/2.2.0.0-806/hive/bin/hive
CREATE TABLE school (id STRING, name STRING, city STRING, state STRING, nick 
STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\054' STORED AS TEXTFILE;
LOAD DATA LOCAL INPATH '/tmp/Schools.csv' INTO TABLE school;
SELECT name FROM school ORDER BY name ASC LIMIT 10;
The SELECT statement fails with the following error:
java.io.FileNotFoundException: File 
file:/usr/lib/hcatalog/share/hcatalog/hcatalog-core.jar does not exist
        at 
org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:524)
        at 
org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:737)
        at 
org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:514)
        at 
org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:409)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:337)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:289)
        at 
org.apache.hadoop.mapreduce.JobSubmitter.copyRemoteFiles(JobSubmitter.java:140)
        at 
org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:213)
        at 
org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
        at 
org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1294)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1291)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1291)
        at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
        at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
        at 
org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
        at 
org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:420)
        at 
org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:161)
        at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1603)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1363)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1176)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1003)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:993)
        at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:246)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:198)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:408)
        at 
org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Job Submission failed with exception 'java.io.FileNotFoundException(File 
file:/usr/lib/hcatalog/share/hcatalog/hcatalog-core.jar does not exist)'
FAILED: Execution Error, return code 1 from 
org.apache.hadoop.hive.ql.exec.mr.MapRedTask


The Hive property in hive-site.xml for HDP 2.2 that controls where 
user-defined jars are loaded from needs to change now that HDP uses versioned 
RPMs. The property should be set to
hive.aux.jars.path=file:///usr/hdp/current/hive-hcatalog/share/hcatalog
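In hive-site.xml terms, the change above would look roughly like the following property definition (a sketch based on the value given in this review; the surrounding contents of hive-site.xml are not shown here):

```xml
<!-- Point Hive's auxiliary jar path at the versioned HDP hcatalog
     location, replacing the old /usr/lib/hcatalog path. The value is
     taken from the proposed fix above; the rest of hive-site.xml is
     omitted. -->
<property>
  <name>hive.aux.jars.path</name>
  <value>file:///usr/hdp/current/hive-hcatalog/share/hcatalog</value>
</property>
```

Using the /usr/hdp/current symlink rather than a hard-coded version directory keeps the property valid across rolling upgrades of the versioned RPMs.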


Diffs
-----

  
ambari-server/src/main/resources/stacks/HDP/2.2/services/HIVE/configuration/hive-site.xml
 4deb489 

Diff: https://reviews.apache.org/r/26285/diff/


Testing
-------

Set the property manually inside the Hive shell, and was then able to run 
queries against the table, e.g.:
[root@c6403 ~]# su - hive
[hive@c6403 ~]$ /usr/hdp/current/hive/bin/hive
hive> set 
hive.aux.jars.path=file:///usr/hdp/current/hive-hcatalog/share/hcatalog/*.jar;
hive> select state, count(id) as counts from school group by state sort by 
counts desc limit 10;


Thanks,

Alejandro Fernandez
