Do you have the HBase conf directory on the HADOOP_CLASSPATH?
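
If the conf dir isn't on the classpath, the map tasks never see hbase-site.xml,
so the HBase client falls back to looking for ZooKeeper on localhost and times
out locating the root region, which is the usual cause of the
NoServerForRegionException you're seeing.

A minimal sketch of the fix (assuming HBase lives under /usr/lib/hbase; adjust
the path for your layout): add the line below to conf/hadoop-env.sh on every
node and restart the tasktrackers.

  # make hbase-site.xml (zookeeper quorum, master address, ...) visible to MR tasks
  export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/lib/hbase/conf

Passing the quorum on the Hive command line should also work, e.g. adding
-hiveconf hbase.zookeeper.quorum=Master1 next to the hbase.master setting
you already pass.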

On Mon, Jul 12, 2010 at 7:57 AM, Gal Barnea <[email protected]> wrote:

>  Hi everyone,
>
>
>
> I’m trying to set up HBase/Hive integration using HBase 0.20.3 and the
> latest Hive source from svn.
>
> I have one machine acting as the HBase Master + ZooKeeper, with Hive on it, and two
> RegionServers.
>
>
>
> I am able to CREATE/DROP the external table in Hive pointing to HBase, and
> SELECT from it after inserting data via the hbase shell.
>
>
>
> However, when I try to run the following, I get the exception below:
>
>
>
> hive --auxpath
> $HIVE_SRC/build/hbase-handler/hive_hbase-handler.jar,$HIVE_SRC/hbase-handler/lib/hbase-0.20.3.jar,$HIVE_SRC/hbase-handler/lib/zookeeper-3.2.2.jar
> -hiveconf hbase.master=Master1:60000
>
>
>
> >CREATE TABLE pokes (foo STRING, bar STRING);
>
>
>
> >LOAD DATA LOCAL INPATH '/root/hive/data/files/kv1.txt' OVERWRITE INTO
> TABLE pokes;
>
>
>
> >CREATE TABLE hbase_hive(key string, value string)
>
>   STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
>
>   WITH SERDEPROPERTIES   ("hbase.columns.mapping" = ":key,cf1:val");
>
>
>
> >insert overwrite table hbase_hive  select * from pokes where foo="97";
>
>
>
>
>
> This fails miserably, and in the Hadoop JobTracker I can see:
>
> java.lang.RuntimeException: Hive Runtime Error while closing operators
>
>         at 
> org.apache.hadoop.hive.ql.exec.ExecMapper.close(ExecMapper.java:248)
>
>         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
>
>         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
>
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
>
>         at org.apache.hadoop.mapred.Child.main(Child.java:170)
>
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: 
> org.apache.hadoop.hbase.client.NoServerForRegionException: Timed out trying 
> to locate root region
>
>         at 
> org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:231)
>
>         at 
> org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:487)
>
>         at 
> org.apache.hadoop.hive.ql.exec.FileSinkOperator.closeOp(FileSinkOperator.java:632)
>
>         at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:540)
>
>         at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:549)
>
>         at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:549)
>
>         at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:549)
>
>         at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:549)
>
>         at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:549)
>
>         at 
> org.apache.hadoop.hive.ql.exec.ExecMapper.close(ExecMapper.java:225)
>
>         ... 4 more
>
> Caused by: org.apache.hadoop.hbase.client.NoServerForRegionException: Timed 
> out trying to locate root region
>
>         at 
> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRootRegion(HConnectionManager.java:976)
>
>         at 
> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:625)
>
>         at 
> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.relocateRegion(HConnectionManager.java:607)
>
>         at 
> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegionInMeta(HConnectionManager.java:738)
>
>         at 
> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:634)
>
>         at 
> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.relocateRegion(HConnectionManager.java:607)
>
>         at 
> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegionInMeta(HConnectionManager.java:738)
>
>         at 
> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:638)
>
>         at 
> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:601)
>
>         at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:128)
>
>         at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:106)
>
>         at 
> org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat.getHiveRecordWriter(HiveHBaseTableOutputFormat.java:75)
>
>         at 
> org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:240)
>
>         at 
> org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:228)
>
>         ... 13 more
>
>
>
>
>
> Can someone please shed some light on this?
>
>
>
> Thanks in advance
>
> Gal
>
>
>
