Hi all,
Please help.

hive> insert into emp values (1, "rakesh");
Query ID = rakesh_20150820223622_e538e1d0-26b0-4747-a553-e68a96d58954
Total jobs = 3
Launching Job 1 out of 3
Number of reduce tasks is set to 0 since there's no reduce operator
java.io.FileNotFoundException: File does not exist: hdfs://localhost:54310/usr/local/apache-hive-1.2.0-bin/lib/derby-10.11.1.1.jar
        at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1122)
        at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
        at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:269)
        at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:390)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:483)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1296)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1293)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1293)
        at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
        at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
        at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:431)
        at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:137)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1650)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1409)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1192)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:736)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Job Submission failed with exception 'java.io.FileNotFoundException(File does not exist: hdfs://localhost:54310/usr/local/apache-hive-1.2.0-bin/lib/derby-10.11.1.1.jar)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
hive>
I am getting this error while inserting values into a Hive table. Hadoop is running (both DFS and YARN are up). What surprises me is the path hdfs://localhost:54310/usr/local/apache-hive-1.2.0-bin/lib/derby-10.11.1.1.jar, which is a combination of fs.defaultFS from core-site.xml and, for the rest, my HIVE_HOME. Why is it looking for the Derby jar on an HDFS path?
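
To make that concrete, here is a minimal sketch of the two pieces I am referring to (the property value is inferred from the hdfs://localhost:54310 prefix in the error; my actual file may differ slightly):

    <!-- core-site.xml: default filesystem, assumed from the prefix in the error above -->
    <property>
      <name>fs.defaultFS</name>
      <value>hdfs://localhost:54310</value>
    </property>

The Derby jar is present on my local filesystem at $HIVE_HOME/lib/derby-10.11.1.1.jar (HIVE_HOME is /usr/local/apache-hive-1.2.0-bin), so it looks as if a scheme-less path is being resolved against fs.defaultFS instead of file:// during job submission, but I don't see where that path is coming from.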
Thanks,
Rakesh
