I made a new Hive build with the following libraries:
zookeeper-3.3.1+10.jar
hbase-0.89.20100924+28.jar
hbase-0.89.20100924+28-tests.jar

I used the following Hive command to connect:
/bin/hive --auxpath
/lib/hive_hbase-handler.jar,/lib/hbase-0.89.20100924+28.jar,/lib/zookeeper-3.3.1+10.jar
-hiveconf hbase.zookeeper.property.clientPort=2181 -hiveconf
hbase.zookeeper.quorum=hadoop1.hbapi.com,hadoop2.hbapi.com,hadoop3.hbapi.com

I was able to connect and read the HBase tables, but I was not able to
insert new keys or load data into a Hive table.

-ray

2010-10-26 00:57:43,177 INFO
org.apache.hadoop.hive.ql.exec.FileSinkOperator: New Final Path: FS
/user/hive/warehouse/tmp_hbase_t1
2010-10-26 00:57:43,291 FATAL ExecMapper: java.lang.NoSuchMethodError:
org.apache.hadoop.hbase.client.HTable.<init>(Lorg/apache/hadoop/conf/Configuration;Ljava/lang/String;)V
        at org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat.getHiveRecordWriter(HiveHBaseTableOutputFormat.java:80)
        at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:242)
        at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:230)
        at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:461)
        at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:507)
        at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:457)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:697)
        at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
        at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:457)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:697)
        at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:66)
        at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:457)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:697)
        at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:464)
        at org.apache.hadoop.hive.ql.exec.ExecMapper.map(ExecMapper.java:180)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:383)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:317)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:217)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1063)
        at org.apache.hadoop.mapred.Child.main(Child.java:211)
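[Editor's note: the NoSuchMethodError above means the HTable(Configuration, String) constructor that the Hive hbase-handler was compiled against does not exist in the HBase jar the task JVMs actually load. As a rough sanity check (a sketch, not part of the original thread; the jar path in the commented usage is a stand-in, not the poster's layout), you can search a class file inside a jar for the constructor's descriptor string, since JVM method descriptors appear verbatim in the class's constant pool:]

```python
# Sketch: report whether a .class file inside a jar contains a given JVM
# method descriptor string.  A constructor's descriptor -- e.g. the
# "(Lorg/apache/hadoop/conf/Configuration;Ljava/lang/String;)V" from the
# stack trace above -- is stored verbatim in the class's constant pool,
# so a byte search is a crude but workable presence check.
import zipfile


def class_has_descriptor(jar_path, class_name, descriptor):
    """True if the named class inside jar_path contains the descriptor bytes."""
    entry = class_name.replace(".", "/") + ".class"
    with zipfile.ZipFile(jar_path) as jar:
        data = jar.read(entry)
    return descriptor.encode("ascii") in data


# Hypothetical usage -- the path below is an example, not your layout:
# class_has_descriptor(
#     "/lib/hbase-0.89.20100924+28.jar",
#     "org.apache.hadoop.hbase.client.HTable",
#     "(Lorg/apache/hadoop/conf/Configuration;Ljava/lang/String;)V")
```

If this returns False for the jar on the task classpath, the constructor Hive expects simply is not there and the jars need to be aligned.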

On Mon, Oct 25, 2010 at 2:36 PM, Ray Duong <ray.du...@gmail.com> wrote:

> Thanks Edward,
>
> I'll rebuild Hive with the exact same libraries as the HBase and Hadoop
> clients.
>
> I'll let you know how the integration testing goes and if we find any other
> issues.
>
> -ray
>
>
>
> On Mon, Oct 25, 2010 at 11:16 AM, Edward Capriolo 
> <edlinuxg...@gmail.com>wrote:
>
>> On Mon, Oct 25, 2010 at 2:03 PM, Ray Duong <ray.du...@gmail.com> wrote:
>> > Thanks, Youngwoo,
>> > I checked out the latest Hive build from trunk, which contains the
>> > HIVE-1264 patch.  I was able to get past the last error message and
>> > could query the HBase table.  However, when I try to copy data from
>> > HBase into a Hive table, I get the following error message.  Does
>> > anyone know what this error is related to and what additional steps
>> > I can take to resolve it?
>> > Greatly appreciated,
>> > -ray
>> >
>> > 2010-10-25 04:19:07,406 INFO org.apache.hadoop.util.NativeCodeLoader:
>> Loaded
>> > the native-hadoop library
>> > 2010-10-25 04:19:07,579 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: fs.trash.interval;  Ignoring.
>> > 2010-10-25 04:19:07,581 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: mapred.local.dir;  Ignoring.
>> > 2010-10-25 04:19:07,581 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter:
>> > mapred.tasktracker.reduce.tasks.maximum;  Ignoring.
>> > 2010-10-25 04:19:07,582 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: fs.checkpoint.dir;  Ignoring.
>> > 2010-10-25 04:19:07,585 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter:
>> mapred.tasktracker.map.tasks.maximum;
>> > Ignoring.
>> > 2010-10-25 04:19:07,587 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter:
>> > hadoop.rpc.socket.factory.class.default;  Ignoring.
>> > 2010-10-25 04:19:07,589 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: mapred.child.ulimit;  Ignoring.
>> > 2010-10-25 04:19:07,592 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: tasktracker.http.threads;
>>  Ignoring.
>> > 2010-10-25 04:19:07,592 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: mapred.job.tracker.handler.count;
>> > Ignoring.
>> > 2010-10-25 04:19:07,592 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
>> > 2010-10-25 04:19:07,625 INFO org.apache.hadoop.mapred.TaskRunner:
>> Creating
>> > symlink:
>> >
>> /data/data03/hadoop/mapred/local/taskTracker/distcache/-3601043851674017223_220470838_1652071132/
>> hadoop229.qwapi.com/tmp/hive-hadoop/hive_2010-10-24_23-19-13_728_5598498044419457333/-mr-10004/e870a146-897b-428a-b55c-6ae93d8dc47a
>> > <-
>> >
>> /data02/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/work/./HIVE_PLANe870a146-897b-428a-b55c-6ae93d8dc47a
>> > 2010-10-25 04:19:07,637 INFO
>> > org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating
>> > symlink:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/jars/.job.jar.crc
>> > <-
>> >
>> /data02/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/work/./.job.jar.crc
>> > 2010-10-25 04:19:07,643 INFO
>> > org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating
>> > symlink:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/jars/job.jar
>> > <-
>> >
>> /data02/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/work/./job.jar
>> > 2010-10-25 04:19:07,654 INFO org.apache.hadoop.metrics.jvm.JvmMetrics:
>> > Initializing JVM Metrics with processName=MAP, sessionId=
>> > 2010-10-25 04:19:07,754 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: fs.trash.interval;  Ignoring.
>> > 2010-10-25 04:19:07,755 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: mapred.local.dir;  Ignoring.
>> > 2010-10-25 04:19:07,756 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter:
>> > mapred.tasktracker.reduce.tasks.maximum;  Ignoring.
>> > 2010-10-25 04:19:07,756 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: dfs.data.dir;  Ignoring.
>> > 2010-10-25 04:19:07,756 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: fs.checkpoint.dir;  Ignoring.
>> > 2010-10-25 04:19:07,758 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: dfs.permissions;  Ignoring.
>> > 2010-10-25 04:19:07,758 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: dfs.block.size;  Ignoring.
>> > 2010-10-25 04:19:07,758 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter:
>> mapred.tasktracker.map.tasks.maximum;
>> > Ignoring.
>> > 2010-10-25 04:19:07,759 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter:
>> > hadoop.rpc.socket.factory.class.default;  Ignoring.
>> > 2010-10-25 04:19:07,759 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: dfs.namenode.handler.count;
>>  Ignoring.
>> > 2010-10-25 04:19:07,760 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: mapred.child.ulimit;  Ignoring.
>> > 2010-10-25 04:19:07,762 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: tasktracker.http.threads;
>>  Ignoring.
>> > 2010-10-25 04:19:07,762 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: mapred.job.tracker.handler.count;
>> > Ignoring.
>> > 2010-10-25 04:19:07,762 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
>> > 2010-10-25 04:19:07,762 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: dfs.datanode.du.reserved;
>>  Ignoring.
>> > 2010-10-25 04:19:07,763 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: dfs.name.dir;  Ignoring.
>> > 2010-10-25 04:19:07,763 WARN org.apache.hadoop.conf.Configuration:
>> >
>> /data/data04/hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201010231249_0013/attempt_201010231249_0013_m_000002_0/job.xml:a
>> > attempt to override final parameter: dfs.datanode.handler.count;
>>  Ignoring.
>> > 2010-10-25 04:19:08,054 FATAL org.apache.hadoop.mapred.Child: Error
>> running
>> > child : java.lang.NoSuchMethodError:
>> >
>> org.apache.hadoop.hbase.client.HTable.<init>(Lorg/apache/hadoop/conf/Configuration;[B)V
>> >       at
>> >
>> org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getRecordReader(HiveHBaseTableInputFormat.java:88)
>> >       at
>> >
>> org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:233)
>> >       at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:363)
>> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:317)
>> >       at org.apache.hadoop.mapred.Child$4.run(Child.java:217)
>> >       at java.security.AccessController.doPrivileged(Native Method)
>> >       at javax.security.auth.Subject.doAs(Subject.java:396)
>> >       at
>> >
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1063)
>> >       at org.apache.hadoop.mapred.Child.main(Child.java:211)
>> >
>> >
>> >
>> >
>> >
>> >
>> >
>> > On Sun, Oct 24, 2010 at 7:39 PM, 김영우 <warwit...@gmail.com> wrote:
>> >>
>> >> Hi ray,
>> >>
>> >> https://issues.apache.org/jira/browse/HIVE-1264
>> >>
>> >> You should upgrade Hive to 0.7 (trunk).  CDH3 beta3 includes the
>> >> Hadoop security features.
>> >>
>> >> - Youngwoo
>> >>
>> >> 2010/10/25 Ray Duong <ray.du...@gmail.com>
>> >>>
>> >>> Hi,
>> >>> I'm getting the following error message after upgrading to CDH3b3.
>>  Does
>> >>> anyone know how to resolve this?
>> >>> Thanks
>> >>> -ray
>> >>>
>> >>> hive> show tables;
>> >>>
>> >>> Exception in thread "main" java.lang.NoClassDefFoundError:
>> >>> org/apache/hadoop/security/UnixUserGroupInformation
>> >>>
>> >>>        at
>> >>>
>> org.apache.hadoop.hive.ql.processors.CommandProcessorFactory.get(CommandProcessorFactory.java:63)
>> >>>
>> >>>        at
>> >>> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:133)
>> >>>
>> >>>        at
>> >>> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:199)
>> >>>
>> >>>        at
>> org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:353)
>> >>>
>> >>>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >>>
>> >>>        at
>> >>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >>>
>> >>>        at
>> >>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >>>
>> >>>        at java.lang.reflect.Method.invoke(Method.java:597)
>> >>>
>> >>>        at org.apache.hadoop.util.RunJar.main(RunJar.java:186)
>> >>>
>> >>> Caused by: java.lang.ClassNotFoundException:
>> >>> org.apache.hadoop.security.UnixUserGroupInformation
>> >>>
>> >>>        at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
>> >>>
>> >>>        at java.security.AccessController.doPrivileged(Native Method)
>> >>>
>> >>>        at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
>> >>>
>> >>>        at java.lang.ClassLoader.loadClass(ClassLoader.java:303)
>> >>>
>> >>>        at
>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>> >>>
>> >>>        at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>> >>>
>> >>>        at
>> java.lang.ClassLoader.loadClassInternal(ClassLoader.java:316)
>> >>>
>> >>>        ... 9 more
>> >
>> >
>> Whenever you see things such as:
>>
>> 2010-10-25 04:19:08,054 FATAL org.apache.hadoop.mapred.Child: Error
>> running child : java.lang.NoSuchMethodError:
>>
>> org.apache.hadoop.hbase.client.HTable.<init>(Lorg/apache/hadoop/conf/Configuration;[B)V
>>        at
>> org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getRecordReader(HiveHBaseTableInputFormat.java:88)
>>
>> This usually indicates that software compiled against one version of a
>> library is calling a method whose signature has changed in the version
>> now on the classpath.
>>
>> Because HBase, Hadoop, and Hive are all actively developed, and Hadoop
>> is being branched by several entities, these issues are more likely to
>> happen.
>>
>> In most cases (with Hadoop as well as other software) you can happily
>> mix and match packages, but in this case you are getting bitten by it.
>>
>> So in a nutshell, the problem is that you have a mismatch somewhere. In
>> your case it is probably easiest to rebuild Hive, since rebuilding HBase
>> or Hadoop would mean cluster-wide changes.
>>
>> When you build Hive, make sure you build the hbase-handler against the
>> version of HBase you are running in production.  Hive probably builds
>> against an older HBase than the one you are using.
>>
>
>
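[Editor's note: Edward's point about version mismatches can be checked mechanically by comparing the jars Hive ships on its --auxpath against the ones deployed on the cluster nodes. A minimal sketch follows; the directory paths in the commented usage are assumptions, not the poster's actual layout:]

```python
# Sketch: flag jars that exist under the same name in two lib directories
# but whose bytes differ -- the classic setup for the NoSuchMethodError
# seen in this thread (Hive shipping one hbase jar, the cluster running
# another).
import hashlib
import os


def jar_digests(directory):
    """Map each *.jar filename in `directory` to the md5 of its bytes."""
    digests = {}
    for name in os.listdir(directory):
        if name.endswith(".jar"):
            with open(os.path.join(directory, name), "rb") as f:
                digests[name] = hashlib.md5(f.read()).hexdigest()
    return digests


def jar_mismatches(hive_lib, cluster_lib):
    """Jar names present in both directories whose contents differ."""
    a, b = jar_digests(hive_lib), jar_digests(cluster_lib)
    return sorted(name for name in a.keys() & b.keys() if a[name] != b[name])


# Hypothetical usage -- paths are examples only:
# jar_mismatches("/opt/hive/lib", "/usr/lib/hbase/lib")
```

Any name this reports is a jar where the Hive build and the cluster disagree, which is exactly the kind of drift rebuilding Hive against the production jars is meant to eliminate.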
