Okay, this problem is resolved now. Here is where things were going wrong in my code: my driver extends Configured and implements Tool, but inside run() I was creating a new Configuration instead of reusing the one returned by getConf(). That fresh Configuration replaced the conf that main() had set up, so the -libjars settings it carried were lost and the HBase jars never reached the map tasks.

Thanks,
Vrushali
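[A minimal sketch of the corrected driver pattern, for reference. The class name MyDriver, the job name, and the commented-out mapper line are placeholders rather than the actual code from this thread; the point is only that run() reuses getConf(), so the jars passed via -libjars stay attached to the job.]

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class MyDriver extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // Wrong (what I had): Configuration conf = new Configuration();
        // Right: reuse the Configuration that ToolRunner/GenericOptionsParser
        // has already populated, so -libjars and -D settings are preserved.
        Configuration conf = getConf();

        Job job = new Job(conf, "my-hbase-job");
        job.setJarByClass(MyDriver.class);
        // job.setMapperClass(MyHBaseMapper.class);  // the mapper that opens the HTable
        // ... set input/output formats, paths, key/value classes ...
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner strips generic options (-libjars, -D, -conf, ...) off the
        // command line and applies them to the Configuration it hands to run().
        System.exit(ToolRunner.run(new Configuration(), new MyDriver(), args));
    }
}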
________________________________
From: Vrushali C <[email protected]>
To: Suraj Varma <[email protected]>; "[email protected]" <[email protected]>
Sent: Wednesday, February 8, 2012 6:05 PM
Subject: Re: hbase - ClassNotFound while connecting through mapper

Yes, the -libjars parameter comes after the map reduce driver. The hbase rowcounter works, and connecting to/accessing hbase tables works remotely as well as through the main driver program that creates the job conf. It's only the mapper that throws a "ClassNotFoundException".

I also tried setting the class path inside the mapper code by doing a System.setProperty... It now prints the class path from within the mapper as follows, but the error is still there:

attempt_201202071551_0129_m_000000_2: -----------------CP in code !! ---------------------
/usr/lib/zookeeper/zookeeper-3.3.3-cdh3u0.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar:/usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0-tests.jar:/usr/lib/hbase/conf/:/usr/lib/hadoop/conf:/usr/java/jdk1.6.0_24/lib/tools.jar:/usr/lib/hadoop-0.20/bin/..:/usr/lib/hadoop-0.20/bin/../hadoop-core-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/bin/../lib/aes.jar:/usr/lib/hadoop-0.20/bin/../lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop-0.20/bin/../lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop-0.20/bin/../lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop-0.20/bin/../lib/commons-cli-1.2.jar:/usr/lib/hadoop-0.20/bin/../lib/commons-codec-1.4.jar:/usr/lib/hadoop-0.20/bin/../lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop-0.20/bin/../lib/commons-el-1.0.jar:/usr/lib/hadoop-0.20/bin/../lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop-0.20/bin/../lib/commons-logging-1.0.4.jar:/usr/lib/hadoop-0.20/bin/../lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop-0.20/bin/../lib/commons-net-1.4.1.jar:/usr/lib/hadoop-0.20/bin/../lib/core-3.1.1.jar:/usr/lib/hadoop-0.20/bin/../lib/hadoop-fairscheduler-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/bin/../lib/hadoop-lzo-0.4.10.jar:/usr/lib/hadoop-0.20/bin/../lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop-0.20/bin/../lib/jackson-core-asl-1.5.2.jar:/usr/lib/hadoop-0.20/bin/../lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hadoop-0.20/bin/../lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop-0.20/bin/../lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop-0.20/bin/../lib/jets3t-0.6.1.jar:/usr/lib/hadoop-0.20/bin/../lib/jetty-6.1.26.jar:/usr/lib/hadoop-0.20/bin/../lib/jetty-servlet-tester-6.1.26.jar:/usr/lib/hadoop-0.20/bin/../lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-0.20/bin/../lib/jsch-0.1.42.jar:/usr/lib/hadoop-0.20/bin/../lib/junit-4.5.jar:/usr/lib/hadoop-0.20/bin/../lib/kfs-0.2.2.jar:/usr/lib/hadoop-0.20/bin/../lib/log4j-1.2.15.jar:/usr/lib/hadoop-0.20/bin/../lib/mockito-all-1.8.2.jar:/usr/lib/hadoop-0.20/bin/../lib/nzjdbc.jar:/usr/lib/hadoop-0.20/bin/../lib/oro-2.0.8.jar:/usr/lib/hadoop-0.20/bin/../lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop-0.20/bin/../lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop-0.20/bin/../lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop-0.20/bin/../lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop-0.20/bin/../lib/xmlenc-0.52.jar:/usr/lib/hadoop-0.20/bin/../lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop-0.20/bin/../lib/jsp-2.1/jsp-api-2.1.jar:/data1/tmp/hadoop/taskTracker/vrushali_channapattan/jobcache/job_201202071551_0129/jars/classes:/data1/tmp/hadoop/taskTracker/vrushali_channapattan/jobcache/job_201202071551_0129/jars/job.jar:/data4/tmp/hadoop/taskTracker/vrushali_channapattan/jobcache/job_201202071551_0129/attempt_201202071551_0129_m_000000_2/work

> > > I have been breaking my head against this since yesterday :(

________________________________
From: Suraj Varma <[email protected]>
To: [email protected]; Vrushali C <[email protected]>
Sent: Wednesday, February 8, 2012 5:34 PM
Subject: Re: hbase - ClassNotFound while connecting through mapper

No - that's what -libjars is for ... it will copy the jars to the distributed cache. Can you check where you are passing in the -libjars parameter? It should come _after_ your map reduce driver name in the script, i.e.

.... <MyMapReduceDriver> -libjars <comma-separated lib jars> ...

My suspicion is that you provided the -libjars argument ahead of the map reduce driver class in your script, and hence it didn't take effect. [An example invocation is sketched after the thread below.]

http://www.cloudera.com/blog/2011/01/how-to-include-third-party-libraries-in-your-map-reduce-job/

If the above is not the case, I would suggest you first try out one of the out-of-the-box hbase mapreduce programs (e.g. rowcounter) and get that working on your cluster, and see how the command line looks in that case. Then follow suit for your custom map reduce program.
--Suraj

On Wed, Feb 8, 2012 at 12:38 PM, Vrushali C <[email protected]> wrote:
> thanks for the response Suraj!
>
> yes i checked the value being set and i removed all wild cards
>
> /usr/lib/hbase/conf:/usr/java/default/lib/tools.jar:/usr/lib/hbase:/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0-tests.jar:/usr/lib/hbase/lib/activation-1.1.jar:/usr/lib/hbase/lib/asm-3.1.jar:/usr/lib/hbase/lib/avro-1.3.3.jar:/usr/lib/hbase/lib/commons-cli-1.2.jar:/usr/lib/hbase/lib/commons-codec-1.4.jar:/usr/lib/hbase/lib/commons-el-1.0.jar:/usr/lib/hbase/lib/commons-httpclient-3.1.jar:/usr/lib/hbase/lib/commons-lang-2.5.jar:/usr/lib/hbase/lib/commons-logging-1.1.1.jar:/usr/lib/hbase/lib/commons-net-1.4.1.jar:/usr/lib/hbase/lib/core-3.1.1.jar:/usr/lib/hbase/lib/guava-r06.jar:/usr/lib/hbase/lib/hadoop-core.jar:/usr/lib/hbase/lib/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-jaxrs-1.5.5.jar:/usr/lib/hbase/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-xc-1.5.5.jar:/usr/lib/hbase/lib/jasper-compiler-5.5.23.jar:/usr/lib/hbase/lib/jasper-runtime-5.5.23.jar:/usr/lib/hbase/lib/jaxb-api-2.1.jar:/usr/lib/hbase/lib/jaxb-impl-2.1.12.jar:/usr/lib/hbase/lib/jersey-core-1.4.jar:/usr/lib/hbase/lib/jersey-json-1.4.jar:/usr/lib/hbase/lib/jersey-server-1.4.jar:/usr/lib/hbase/lib/jettison-1.1.jar:/usr/lib/hbase/lib/jetty-6.1.26.jar:/usr/lib/hbase/lib/jetty-util-6.1.26.jar:/usr/lib/hbase/lib/jruby-complete-1.0.3.jar:/usr/lib/hbase/lib/jsp-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1.jar:/usr/lib/hbase/lib/jsr311-api-1.1.1.jar:/usr/lib/hbase/lib/log4j-1.2.16.jar:/usr/lib/hbase/lib/protobuf-java-2.3.0.jar:/usr/lib/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hbase/lib/servlet-api-2.5.jar:/usr/lib/hbase/lib/slf4j-api-1.5.8.jar:/usr/lib/hbase/lib/slf4j-log4j12-1.5.8.jar:/usr/lib/hbase/lib/stax-api-1.0.1.jar:/usr/lib/hbase/lib/thrift-0.2.0.jar:/usr/lib/hbase/lib/xmlenc-0.52.jar:/usr/lib/hbase/lib/zookeeper.jar:/usr/lib/zookeeper/zookeeper-3.3.3-cdh3u0.jar:/usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar
>
> Also when the mapper runs it prints the environment as
>
> 12/02/08 12:31:09 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/usr/lib/hadoop-0.20/conf:/usr/java/default/lib/tools.jar:/usr/lib/hadoop-0.20:/usr/lib/hadoop-0.20/hadoop-core-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/aes.jar:/usr/lib/hadoop-0.20/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop-0.20/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop-0.20/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop-0.20/lib/commons-cli-1.2.jar:/usr/lib/hadoop-0.20/lib/commons-codec-1.4.jar:/usr/lib/hadoop-0.20/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop-0.20/lib/commons-el-1.0.jar:/usr/lib/hadoop-0.20/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop-0.20/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop-0.20/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop-0.20/lib/commons-net-1.4.1.jar:/usr/lib/hadoop-0.20/lib/core-3.1.1.jar:/usr/lib/hadoop-0.20/lib/hadoop-capacity-scheduler-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hadoop-datajoin-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hadoop-failmon-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hadoop-fairscheduler-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hadoop-gridmix-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hadoop-index-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hadoop-lzo-0.4.10.jar:/usr/lib/hadoop-0.20/lib/hadoop-mrunit-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hadoop-streaming-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hadoop-thriftfs-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hadoop-vaidya-0.20.2-cdh3u0.jar:/usr/lib/hadoop-0.20/lib/hdfsproxy-2.0.jar:/usr/lib/hadoop-0.20/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop-0.20/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hadoop-0.20/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hadoop-0.20/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop-0.20/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop-0.20/lib/jets3t-0.6.1.jar:/usr/lib/hadoop-0.20/lib/jetty-6.1.26.jar:/usr/lib/hadoop-0.20/lib/jetty-servlet-tester-6.1.26.jar:/usr/lib/hadoop-0.20/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-0.20/lib/jsch-0.1.42.jar:/usr/lib/hadoop-0.20/lib/junit-4.5.jar:/usr/lib/hadoop-0.20/lib/kfs-0.2.2.jar:/usr/lib/hadoop-0.20/lib/libthrift.jar:/usr/lib/hadoop-0.20/lib/log4j-1.2.15.jar:/usr/lib/hadoop-0.20/lib/lucene-core-2.3.1.jar:/usr/lib/hadoop-0.20/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop-0.20/lib/mysql-connector-java-5.1.15-bin.jar:/usr/lib/hadoop-0.20/lib/nzjdbc.jar:/usr/lib/hadoop-0.20/lib/oro-2.0.8.jar:/usr/lib/hadoop-0.20/lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop-0.20/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop-0.20/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop-0.20/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop-0.20/lib/thrift-fb303-0.5.0.jar:/usr/lib/hadoop-0.20/lib/xmlenc-0.52.jar:/usr/lib/hadoop-0.20/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop-0.20/lib/jsp-2.1/jsp-api-2.1.jar:/usr/lib/zookeeper/zookeeper-3.3.3-cdh3u0.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar:/usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0-tests.jar:/usr/lib/hbase/conf/
>
> Please help !! Should i be copying over these jars to distributed cache myself?
>
> thanks
> Vrushali
>
> ________________________________
> From: Suraj Varma <[email protected]>
> To: [email protected]; Vrushali C <[email protected]>
> Sent: Wednesday, February 8, 2012 12:26 AM
> Subject: Re: hbase - ClassNotFound while connecting through mapper
>
> Perhaps your HADOOP_CLASSPATH is not getting set properly.
>
>>> export HADOOP_CLASSPATH=`hbase classpath`:$ZK_CLASSPATH:$HADOOP_CLASSPATH
>
> Can you set the absolute path to hbase above? Also - try echo-ing the hadoop classpath to ensure that HADOOP_CLASSPATH indeed has the hbase jars & conf directory.
> --Suraj
>
> On Tue, Feb 7, 2012 at 11:31 PM, Vrushali C <[email protected]> wrote:
>>
>> I am trying to connect to an hbase table through a mapper's setup method, but it throws a class not found exception.
>>
>> I have been reading the forums and understood I should be checking whether hbase is on the hadoop classpath. I also looked at
>>
>> http://hbase.apache.org/docs/current/api/org/apache/hadoop/hbase/mapreduce/package-summary.html#classpath
>>
>> Accordingly, I have tried both setting -libjars and setting the hadoop classpath. I expanded all the jars in `hbase classpath` and listed them on the -libjars option.
>>
>> I still get
>> Error: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
>> or
>> Error: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.client.HTable
>> if I try to access HBaseConfiguration or HTable in the mapper code.
>>
>> Attempt at setting classpath:
>> export ZK_HOME=/usr/lib/zookeeper
>> export ZK_CLASSPATH=$ZK_HOME/zookeeper-3.3.3-cdh3u0.jar
>> export HADOOP_CLASSPATH=`hbase classpath`:$ZK_CLASSPATH:$HADOOP_CLASSPATH
>>
>> Attempt at including in -libjars:
>> -libjars /usr/lib/hbase/conf,/usr/java/default/lib/tools.jar,/usr/lib/hbase,/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar,/usr/lib/hbase/hbase-0.90.1-cdh3u0-tests.jar,/usr/lib/hbase/lib/activation-1.1.jar,/usr/lib/hbase/lib/asm-3.1.jar,/usr/lib/hbase/lib/avro-1.3.3.jar,/usr/lib/hbase/lib/commons-cli-1.2.jar,/usr/lib/hbase/lib/commons-codec-1.4.jar,/usr/lib/hbase/lib/commons-el-1.0.jar,/usr/lib/hbase/lib/commons-httpclient-3.1.jar,/usr/lib/hbase/lib/commons-lang-2.5.jar,/usr/lib/hbase/lib/commons-logging-1.1.1.jar,/usr/lib/hbase/lib/commons-net-1.4.1.jar,/usr/lib/hbase/lib/core-3.1.1.jar,/usr/lib/hbase/lib/guava-r06.jar,/usr/lib/hbase/lib/hadoop-core.jar,/usr/lib/hbase/lib/hbase-0.90.1-cdh3u0.jar,/usr/lib/hbase/lib/jackson-core-asl-1.5.2.jar,/usr/lib/hbase/lib/jackson-jaxrs-1.5.5.jar,/usr/lib/hbase/lib/jackson-mapper-asl-1.5.2.jar,/usr/lib/hbase/lib/jackson-xc-1.5.5.jar,/usr/lib/hbase/lib/jasper-compiler-5.5.23.jar,/usr/lib/hbase/lib/jasper-runtime-5.5.23.jar,/usr/lib/hbase/lib/jaxb-api-2.1.jar,/usr/lib/hbase/lib/jaxb-impl-2.1.12.jar,/usr/lib/hbase/lib/jersey-core-1.4.jar,/usr/lib/hbase/lib/jersey-json-1.4.jar,/usr/lib/hbase/lib/jersey-server-1.4.jar,/usr/lib/hbase/lib/jettison-1.1.jar,/usr/lib/hbase/lib/jetty-6.1.26.jar,/usr/lib/hbase/lib/jetty-util-6.1.26.jar,/usr/lib/hbase/lib/jruby-complete-1.0.3.jar,/usr/lib/hbase/lib/jsp-2.1-6.1.14.jar,/usr/lib/hbase/lib/jsp-api-2.1-6.1.14.jar,/usr/lib/hbase/lib/jsp-api-2.1.jar,/usr/lib/hbase/lib/jsr311-api-1.1.1.jar,/usr/lib/hbase/lib/log4j-1.2.16.jar,/usr/lib/hbase/lib/protobuf-java-2.3.0.jar,/usr/lib/hbase/lib/servlet-api-2.5-6.1.14.jar,/usr/lib/hbase/lib/servlet-api-2.5.jar,/usr/lib/hbase/lib/slf4j-api-1.5.8.jar,/usr/lib/hbase/lib/slf4j-log4j12-1.5.8.jar,/usr/lib/hbase/lib/stax-api-1.0.1.jar,/usr/lib/hbase/lib/thrift-0.2.0.jar,/usr/lib/hbase/lib/xmlenc-0.52.jar,/usr/lib/hbase/lib/zookeeper.jar,/etc/zookeeper,/etc/hadoop-0.20/conf,/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar,/usr/lib/hbase/conf,/usr/lib/hbase/conf,/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar,/usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar,/usr/lib/zookeeper/zookeeper-3.3.3-cdh3u0.jar
>>
>> Would appreciate any help!!
>>
>> Stack trace:
>> 12/02/07 23:24:21 INFO mapred.JobClient: Task Id : attempt_201202071551_0032_m_000000_2, Status : FAILED
>> Error: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>>     at java.lang.Class.forName0(Native Method)
>>     at java.lang.Class.forName(Class.java:247)
>>     at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:943)
>>     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:994)
>>     at org.apache.hadoop.mapreduce.JobContext.getMapperClass(JobContext.java:212)
>>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:601)
>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
>>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:396)
>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
>>     at org.apache.hadoop.mapred.Child.main(Child.java:262)
>>
>> Thanks
>> Vrushali
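[For reference, the -libjars placement Suraj describes, sketched as a command line. The jar name my-mr-job.jar, the driver class com.example.MyDriver, and the input/output arguments are placeholders; the -libjars list is trimmed to the jars most relevant to this ClassNotFoundException, using the CDH3u0 paths from the thread.]

# Client-side classpath so the submitting JVM itself can see the HBase classes;
# echo it afterwards to confirm the hbase jars and conf directory are present.
export HADOOP_CLASSPATH=`hbase classpath`:$HADOOP_CLASSPATH
echo $HADOOP_CLASSPATH

# Generic options such as -libjars must come after the driver class name and
# before the job's own arguments, so GenericOptionsParser can pick them up and
# ship the jars to the distributed cache for the map tasks.
hadoop jar my-mr-job.jar com.example.MyDriver \
  -libjars /usr/lib/hbase/hbase-0.90.1-cdh3u0.jar,/usr/lib/zookeeper/zookeeper-3.3.3-cdh3u0.jar,/usr/lib/hbase/lib/guava-r06.jar \
  <input-path> <output-path>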
