Thanks Cheolsoo, I figured I would need to pass a lot of HBase jars (and the jars HBase depends on) through the -D option. Instead, I put all the jars in HBase's lib directory on HADOOP_CLASSPATH, and it worked. I got the idea from this page:
http://mikaelsitruk.wordpress.com/2011/09/15/pig-and-hbase-integration/
I had trouble following its exact steps, but putting the following at the end of hadoop-env.sh made Hadoop aware of HBase:

  HBASE_JARS=
  for f in $HBASE_HOME/lib/*.jar; do
    HBASE_JARS=${HBASE_JARS}:$f
  done
  export HADOOP_CLASSPATH=$HBASE_JARS:$HADOOP_CLASSPATH
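For completeness, Cheolsoo's -D suggestion would have looked roughly like the line below. This is only a sketch: pig.additional.jars is the real property, but the protobuf jar name under $HBASE_HOME/lib varies by version, and myscript.pig is a placeholder, so check with ls first:

  # Ship the missing jar(s) to the back-end; pig.additional.jars takes a
  # colon-separated list. The protobuf jar name here is an assumption --
  # use whatever file your install actually has.
  pig -Dpig.additional.jars=$HBASE_HOME/lib/protobuf-java.jar myscript.pig

And after changing hadoop-env.sh, a quick sanity check that Hadoop really picks the jars up (the grep pattern is just a guess at the jar names):

  # Print the effective classpath one entry per line and look for HBase jars
  hadoop classpath | tr ':' '\n' | grep -i hbase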
Regards,
Jack

On Wed, Nov 21, 2012 at 7:56 PM, Cheolsoo Park <[email protected]> wrote:

> Hi Jack,
>
> PIG_CLASSPATH doesn't ship jar files to the back-end; it only adds them
> to the classpath on the front-end.
>
> java.lang.ClassNotFoundException: com.google.protobuf.Message
>
> You should make protobuf available to the mappers on the back-end. Please
> try passing it via -Dpig.additional.jars=<path to protobuf jar> in your Pig
> command. This will add the protobuf jar to the distributed cache as well as
> the classpath in the mappers.
>
> Thanks,
> Cheolsoo
>
>
> On Tue, Nov 20, 2012 at 1:48 PM, Jinyuan Zhou <[email protected]> wrote:
>
> > Hi,
> > I am using org.apache.pig.backend.hadoop.hbase.HBaseStorage to load from
> > an hbase table in Pig. It works in local mode, but when I try it in
> > mapreduce mode, the mappers get a ClassNotFoundException:
> >
> > [main] ERROR org.apache.pig.tools.grunt.GruntParser - ERROR 2997: Unable
> > to recreate exception from backed error: Error:
> > java.lang.ClassNotFoundException: com.google.protobuf.Message
> >
> > I installed hadoop, hbase, and hive through brew on my Mac (OS X Mountain
> > Lion) and added the hbase jar to PIG_CLASSPATH to be able to load from an
> > hbase table in local mode. It seems I am still missing something.
> > Thanks,
> > Jack

--
Jinyuan (Jack) Zhou
