Hi Dima,
Thanks for the fast response. Unfortunately, this is not working for me. I tried:

hadoop jar /usr/lib/hbase/hbase-server-0.98.6-cdh5.2.1.jar rowcounter \
  -libjars /usr/lib/hbase/lib/hbase-client-0.98.6-cdh5.2.1.jar "mytable"

and

hadoop jar /usr/lib/hbase/hbase-server-0.98.6-cdh5.2.1.jar rowcounter \
  "mytable" -libjars /usr/lib/hbase/lib/hbase-client-0.98.6-cdh5.2.1.jar

Both give the same error!
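One more variant that might be worth trying (an assumption on my part, based on the hdfs:// prefix in the stack trace below, not a confirmed fix): giving -libjars an explicit file:// scheme, so the path is resolved against the local filesystem instead of the cluster's default filesystem:

```shell
# Hypothetical variant: the file:// scheme forces local resolution, so the
# client should upload the JAR itself rather than look for it in HDFS
hadoop jar /usr/lib/hbase/hbase-server-0.98.6-cdh5.2.1.jar rowcounter \
  -libjars file:///usr/lib/hbase/lib/hbase-client-0.98.6-cdh5.2.1.jar \
  "mytable"
```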

In addition, the JAR does exist on the local filesystem:
ls /usr/lib/hbase/lib/hbase-client-0.98.6-cdh5.2.1.jar
/usr/lib/hbase/lib/hbase-client-0.98.6-cdh5.2.1.jar

I also tried running my own fat JAR, which worked on CDH4; after recompiling
it against CDH5 (the same version as the cluster) I get the same error.
I guess this is an HBase environment issue, but I can't put my finger on it.
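One thing I noticed while re-reading my hadoop-env.sh line quoted below (an assumption, not a confirmed diagnosis): /usr/lib/hbase/bin/hbase classpath sits inside the quotes as a literal string, so the hbase script is never actually executed; it would need command substitution, i.e. $(...). A minimal sketch of the difference, using /bin/echo as a stand-in for the hbase script:

```shell
EXTRA="old:entries"

# Literal string: the command inside the quotes is never run; the
# variable just contains the command's path and arguments as text.
broken="/bin/echo hbase-jars:$EXTRA"

# Command substitution: the command runs and its stdout is captured.
fixed="$(/bin/echo hbase-jars):$EXTRA"

echo "$broken"
echo "$fixed"
```

If that is the culprit, the fix in hadoop-env.sh would presumably be to use $(/usr/lib/hbase/bin/hbase classpath) inside the quotes.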

On Wed, Dec 10, 2014 at 9:56 AM, Yaniv Yancovich <[email protected]>
wrote:

>
> ---------- Forwarded message ----------
> From: Dima Spivak <[email protected]>
> Date: Tue, Dec 9, 2014 at 11:23 PM
> Subject: Re: My cdh5.2 cluster get FileNotFoundException when running
> hbase MR jobs
> To: "[email protected]" <[email protected]>
> Cc: Yaniv Yancovich <[email protected]>
>
>
> Dear Ehud,
>
> You need the -libjars <jar> argument to move the dependency from your local
> file system into HDFS (the error occurs because that JAR is not there).
>
> -Dima
>
> On Tue, Dec 9, 2014 at 1:05 AM, Ehud Lev <[email protected]> wrote:
>
>> My cdh5.2 cluster has a problem running HBase MR jobs.
>>
>> For example, I added the HBase classpath to the Hadoop classpath:
>> vi /etc/hadoop/conf/hadoop-env.sh
>> and added the line:
>> export HADOOP_CLASSPATH="/usr/lib/hbase/bin/hbase classpath:$HADOOP_CLASSPATH"
>>
>> And when I run:
>> hadoop jar /usr/lib/hbase/hbase-server-0.98.6-cdh5.2.1.jar rowcounter "mytable"
>>
>> I get the following exception:
>>
>> 14/12/09 03:44:02 WARN security.UserGroupInformation: PriviledgedActionException as:root (auth:SIMPLE) cause:java.io.FileNotFoundException: File does not exist: hdfs://le-hds3-hb2/usr/lib/hbase/lib/hbase-client-0.98.6-cdh5.2.1.jar
>> Exception in thread "main" java.lang.reflect.InvocationTargetException
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>         at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:54)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
>> Caused by: java.io.FileNotFoundException: File does not exist: hdfs://le-hds3-hb2/usr/lib/hbase/lib/hbase-client-0.98.6-cdh5.2.1.jar
>>         at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1083)
>>         at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1075)
>>         at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>>         at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1075)
>>         at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
>>         at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
>>         at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
>>         at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
>>         at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
>>         at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
>>         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
>>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
>>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.security.auth.Subject.doAs(Subject.java:415)
>>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
>>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
>>         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1313)
>>         at org.apache.hadoop.hbase.mapreduce.RowCounter.main(RowCounter.java:191)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>         at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
>>         at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:145)
>>         at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:153)
>>
>
>
>
