Hi All,

I am trying to load a CSV file using the HBase bulk load feature. Below is the
URL I referred to:

http://hbase.apache.org/bulk-loads.html

I am new to Hadoop and MapReduce, and I have run into a problem that I hope
someone can help me with. When I run my MapReduce job, I get the error
message below.

12/03/06 00:58:13 INFO mapred.JobClient: Cleaning up the staging area 
hdfs://master/hadoop_home/tmp_directory/mapred/staging/hadoop/.staging/job_201203052033_0011
Exception in thread "main" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:51)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input 
path does not exist: hdfs://master/myHbaseInputDataDir
        at 
org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:235)
        at 
org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:252)
        at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:902)
        at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:919)
        at org.apache.hadoop.mapred.JobClient.access$500(JobClient.java:170)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:838)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:791)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Unknown Source)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
        at 
org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:791)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:465)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:494)
        at org.apache.hadoop.hbase.mapreduce.ImportTsv.main(ImportTsv.java:313)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at 
org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
        ... 10 more


I set the input directory to /myHbaseInputDataDir and the output directory to
/myHbaseOutputDataDir, and I run the jar with the following command:

HADOOP_CLASSPATH=`hbase classpath` $HADOOP_HOME/bin/hadoop jar \
  /hbase_home/hbase-0.92.0/hbase-0.92.0.jar importtsv \
  -Dimporttsv.bulk.output=/myHbaseOutputDataDir \
  -Dimporttsv.columns=HBASE_ROW_KEY,SerialNumber,Name,Asset Tblassets \
  /myHbaseInputDataDir
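
In case I am staging the data incorrectly, here is roughly how I expected to
put the file into HDFS before running the job (the local CSV path below is
just a placeholder for my actual file, not the real path):

```shell
# Check whether the input directory is visible on the cluster's
# default filesystem (the job resolves /myHbaseInputDataDir against it)
hadoop fs -ls hdfs://master/myHbaseInputDataDir

# If it is missing, create it and copy the CSV in
# (/local/path/data.csv is a placeholder)
hadoop fs -mkdir /myHbaseInputDataDir
hadoop fs -put /local/path/data.csv /myHbaseInputDataDir/
```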

However, I get an exception about the input path:
Input path does not exist: hdfs://master/myHbaseInputDataDir. Why can't the
job find this input path? Isn't that the directory I passed on the command
line? Any help or clarification as to what I am doing wrong would be
appreciated.

Thanks
Rinku Garg



