Thanks for the response, Suraj!

Yes, I checked the value being set, and I removed all wildcards:
/usr/lib/hbase/conf:/usr/java/default/lib/tools.jar:/usr/lib/hbase:/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0-tests.jar:/usr/lib/hbase/lib/activation-1.1.jar:/usr/lib/hbase/lib/asm-3.1.jar:/usr/lib/hbase/lib/avro-1.3.3.jar:/usr/lib/hbase/lib/commons-cli-1.2.jar:/usr/lib/hbase/lib/commons-codec-1.4.jar:/usr/lib/hbase/lib/commons-el-1.0.jar:/usr/lib/hbase/lib/commons-httpclient-3.1.jar:/usr/lib/hbase/lib/commons-lang-2.5.jar:/usr/lib/hbase/lib/commons-logging-1.1.1.jar:/usr/lib/hbase/lib/commons-net-1.4.1.jar:/usr/lib/hbase/lib/core-3.1.1.jar:/usr/lib/hbase/lib/guava-r06.jar:/usr/lib/hbase/lib/hadoop-core.jar:/usr/lib/hbase/lib/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-jaxrs-1.5.5.jar:/usr/lib/hbase/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-xc-1.5.5.jar:/usr/lib/hbase/lib/jasper-compiler-5.5.23.jar:/usr/lib/hbase/lib/jasper-runtime-5.5.23.jar:/usr/lib/hbase/lib/jaxb-api-2.1.jar:/usr/lib/hbase/lib/jaxb-impl-2.1.12.jar:/usr/lib/hbase/lib/jersey-core-1.4.jar:/usr/lib/hbase/lib/jersey-json-1.4.jar:/usr/lib/hbase/lib/jersey-server-1.4.jar:/usr/lib/hbase/lib/jettison-1.1.jar:/usr/lib/hbase/lib/jetty-6.1.26.jar:/usr/lib/hbase/lib/jetty-util-6.1.26.jar:/usr/lib/hbase/lib/jruby-complete-1.0.3.jar:/usr/lib/hbase/lib/jsp-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1.jar:/usr/lib/hbase/lib/jsr311-api-1.1.1.jar:/usr/lib/hbase/lib/log4j-1.2.16.jar:/usr/lib/hbase/lib/protobuf-java-2.3.0.jar:/usr/lib/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hbase/lib/servlet-api-2.5.jar:/usr/lib/hbase/lib/slf4j-api-1.5.8.jar:/usr/lib/hbase/lib/slf4j-log4j12-1.5.8.jar:/usr/lib/hbase/lib/stax-api-1.0.1.jar:/usr/lib/hbase/lib/thrift-0.2.0.jar:/usr/lib/hbase/lib/xmlenc-0.52.jar:/usr/lib/hbase/lib/zookeeper.jar:/usr/lib/zookeeper/zookeeper-3.3.3-cdh3u0.jar:/usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar



Also, when the mapper runs, it prints the environment as:

12/02/08 12:31:09 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/usr/lib/hadoop-0.20/conf:/usr/java/default/lib/tools.jar:/usr/lib/hadoop-0.20:/usr/lib/hadoop-0.20/hadoop-core-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/aes.jar:/usr/lib/hadoop-0.20/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop-0.20/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop-0.20/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop-0.20/lib/commons-cli-1.2.jar:/usr/lib/hadoop-0.20/lib/commons-codec-1.4.jar:/usr/lib/hadoop-0.20/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop-0.20/lib/commons-el-1.0.jar:/usr/lib/hadoop-0.20/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop-0.20/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop-0.20/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop-0.20/lib/commons-net-1.4.1.jar:/usr/lib/hadoop-0.20/lib/core-3.1.1.jar:/usr/lib/hadoop-0.20/lib/hadoop-capacity-scheduler-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hadoop-datajoin-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hadoop-failmon-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hadoop-fairscheduler-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hadoop-gridmix-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hadoop-index-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hadoop-lzo-0.4.10.jar:/usr/lib/hadoop-0.20/lib/hadoop-mrunit-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hadoop-streaming-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hadoop-thriftfs-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hadoop-vaidya-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hdfsproxy-2.0.jar:/usr/lib/hadoop-0.20/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop-0.20/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hadoop-0.20/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hadoop-0.20/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop-0.20/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop-0.20/lib/jets3t-0.6.1.jar:/usr/lib/hadoop-0.20/lib/jetty-6.1.26.jar:/usr/lib/hadoop-0.20/lib/jetty-servlet-tester-6.1.26.jar:/usr/lib/hadoop-0.20/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-0.20/lib/jsch-0.1.42.jar:/usr/lib/hadoop-0.20/lib/junit-4.5.jar:/usr/lib/hadoop-0.20/lib/kfs-0.2.2.jar:/usr/lib/hadoop-0.20/lib/libthrift.jar:/usr/lib/hadoop-0.20/lib/log4j-1.2.15.jar:/usr/lib/hadoop-0.20/lib/lucene-core-2.3.1.jar:/usr/lib/hadoop-0.20/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop-0.20/lib/mysql-connector-java-5.1.15-bin.jar:/usr/lib/hadoop-0.20/lib/nzjdbc.jar:/usr/lib/hadoop-0.20/lib/oro-2.0.8.jar:/usr/lib/hadoop-0.20/lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop-0.20/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop-0.20/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop-0.20/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop-0.20/lib/thrift-fb303-0.5.0.jar:/usr/lib/hadoop-0.20/lib/xmlenc-0.52.jar:/usr/lib/hadoop-0.20/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop-0.20/lib/jsp-2.1/jsp-api-2.1.jar:/usr/lib/zookeeper/zookeeper-3.3.3-cdh3u0.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar:/usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0-tests.jar:/usr/lib/hbase/conf/



Please help! Should I be copying these jars over to the distributed cache myself?
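
For instance, would something like the sketch below be the right way to push the HBase and ZooKeeper jars to the task JVMs via the distributed cache? (myjob.jar, MyDriver and the input/output paths are just placeholders for my actual job; the jar versions are the CDH3u0 ones listed above.)

# client-side classpath so the driver itself can load the HBase classes at submit time
export HADOOP_CLASSPATH=/usr/lib/hbase/conf:/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar:/usr/lib/zookeeper/zookeeper-3.3.3-cdh3u0.jar:$HADOOP_CLASSPATH
# ship only the jar files the mapper needs; -libjars copies them to the distributed cache
hadoop jar myjob.jar MyDriver \
    -libjars /usr/lib/hbase/hbase-0.90.1-cdh3u0.jar,/usr/lib/zookeeper/zookeeper-3.3.3-cdh3u0.jar,/usr/lib/hbase/lib/guava-r06.jar \
    <input> <output>

From what I have read, -libjars only accepts jar files (not conf directories like /usr/lib/hbase/conf) and only takes effect if the driver runs through ToolRunner/GenericOptionsParser; otherwise calling TableMapReduceUtil.addDependencyJars(job) in the driver is supposed to achieve the same thing. Is that correct?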

Thanks,
Vrushali



________________________________
 From: Suraj Varma <[email protected]>
To: [email protected]; Vrushali C <[email protected]> 
Sent: Wednesday, February 8, 2012 12:26 AM
Subject: Re: hbase - ClassNotFound while connecting through mapper
 
Perhaps your HADOOP_CLASSPATH is not getting set properly.

>> export HADOOP_CLASSPATH=`hbase classpath`:$ZK_CLASSPATH:$HADOOP_CLASSPATH

Can you set the absolute path to hbase above? Also, try echoing the
Hadoop classpath to ensure that HADOOP_CLASSPATH indeed has the hbase
jars and conf directory.
--Suraj

On Tue, Feb 7, 2012 at 11:31 PM, Vrushali C <[email protected]> wrote:
>
>
> I am trying to connect to an HBase table through a mapper's setup method, but
> it throws a ClassNotFoundException.
>
> I have been reading the forums and understood that I should check whether HBase
> is on the Hadoop classpath. I also looked at
>
> http://hbase.apache.org/docs/current/api/org/apache/hadoop/hbase/mapreduce/package-summary.html#classpath
>
> Accordingly, I have tried both setting -libjars and setting the Hadoop
> classpath. I expanded all the jars in `hbase classpath` and listed them in
> the -libjars option.
>
> I still get either
>
> Error: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
>
> or
>
> Error: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.client.HTable
>
> whenever I try to access HBaseConfiguration or HTable in the mapper code.
>
> Attempt at setting the classpath:
> export ZK_HOME=/usr/lib/zookeeper
> export ZK_CLASSPATH=$ZK_HOME/zookeeper-3.3.3-cdh3u0.jar
> export HADOOP_CLASSPATH=`hbase classpath`:$ZK_CLASSPATH:$HADOOP_CLASSPATH
>
> Attempt at including the jars via -libjars:
>
>     -libjars
>
  
> /usr/lib/hbase/conf,/usr/java/default/lib/tools.jar,/usr/lib/hbase,/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar,/usr/lib/hbase/hbase-0.90.1-cdh3u0-tests.jar,/usr/lib/hbase/lib/activation-1.1.jar,/usr/lib/hbase/lib/asm-3.1.jar,/usr/lib/hbase/lib/avro-1.3.3.jar,/usr/lib/hbase/lib/commons-cli-1.2.jar,/usr/lib/hbase/lib/commons-codec-1.4.jar,/usr/lib/hbase/lib/commons-el-1.0.jar,/usr/lib/hbase/lib/commons-httpclient-3.1.jar,/usr/lib/hbase/lib/commons-lang-2.5.jar,/usr/lib/hbase/lib/commons-logging-1.1.1.jar,/usr/lib/hbase/lib/commons-net-1.4.1.jar,/usr/lib/hbase/lib/core-3.1.1.jar,/usr/lib/hbase/lib/guava-r06.jar,/usr/lib/hbase/lib/hadoop-core.jar,/usr/lib/hbase/lib/hbase-0.90.1-cdh3u0.jar,/usr/lib/hbase/lib/jackson-core-asl-1.5.2.jar,/usr/lib/hbase/lib/jackson-jaxrs-1.5.5.jar,/usr/lib/hbase/lib/jackson-mapper-asl-1.5.2.jar,/usr/lib/hbase/lib/jackson-xc-1.5.5.jar,/usr/lib/hbase/lib/jasper-compiler-5.5.23.jar,/usr/lib/hbase/lib/jasper-runtime-5.5.23.jar,/usr/lib/hbase/lib/jaxb-api-2.1.jar,/usr/lib/hbase/lib/jaxb-impl-2.1.12.jar,/usr/lib/hbase/lib/jersey-core-1.4.jar,/usr/lib/hbase/lib/jersey-json-1.4.jar,/usr/lib/hbase/lib/jersey-server-1.4.jar,/usr/lib/hbase/lib/jettison-1.1.jar,/usr/lib/hbase/lib/jetty-6.1.26.jar,/usr/lib/hbase/lib/jetty-util-6.1.26.jar,/usr/lib/hbase/lib/jruby-complete-1.0.3.jar,/usr/lib/hbase/lib/jsp-2.1-6.1.14.jar,/usr/lib/hbase/lib/jsp-api-2.1-6.1.14.jar,/usr/lib/hbase/lib/jsp-api-2.1.jar,/usr/lib/hbase/lib/jsr311-api-1.1.1.jar,/usr/lib/hbase/lib/log4j-1.2.16.jar,/usr/lib/hbase/lib/protobuf-java-2.3.0.jar,/usr/lib/hbase/lib/servlet-api-2.5-6.1.14.jar,/usr/lib/hbase/lib/servlet-api-2.5.jar,/usr/lib/hbase/lib/slf4j-api-1.5.8.jar,/usr/lib/hbase/lib/slf4j-log4j12-1.5.8.jar,/usr/lib/hbase/lib/stax-api-1.0.1.jar,/usr/lib/hbase/lib/thrift-0.2.0.jar,/usr/lib/hbase/lib/xmlenc-0.52.jar,/usr/lib/hbase/lib/zookeeper.jar,/etc/zookeeper,/etc/hadoop-0.20/conf,/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar,/usr/lib/hbase/conf,/usr/lib/hbase/conf,/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar,/usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar,/usr/lib/zookeeper/zookeeper-3.3.3-cdh3u0.jar
>
> I would appreciate any help!
>
>
> Stack trace:
> 12/02/07 23:24:21 INFO mapred.JobClient: Task Id : 
> attempt_201202071551_0032_m_000000_2, Status : FAILED
> Error: java.lang.ClassNotFoundException: 
> org.apache.hadoop.hbase.HBaseConfiguration
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>     at java.lang.Class.forName0(Native Method)
>     at java.lang.Class.forName(Class.java:247)
>     at 
> org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:943)
>     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:994)
>     at 
> org.apache.hadoop.mapreduce.JobContext.getMapperClass(JobContext.java:212)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:601)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
>     at org.apache.hadoop.mapred.Child.main(Child.java:262)
>
>
> Thanks
> Vrushali
