Re: [Error]: Spark 1.5.2 + HiveHbase Integration

2016-02-29 Thread mohit.kaushik

Don't you think you need an HBase jar?
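
The class Spark cannot load, org.apache.hadoop.hbase.util.Bytes, comes from the
HBase client libraries, not from any of the Hive jars you listed. As a rough
sketch only (I am assuming a spark-shell launch and the unversioned jar names
under /usr/hdp/2.3.4.0-3485/hbase/lib/; use the version-suffixed names that are
actually on your cluster), something like this should get the SerDe past that
error:

# keep the jars you already pass and add the HBase client jars next to them
spark-shell --jars \
/usr/hdp/2.3.4.0-3485/hive/lib/hive-hbase-handler.jar,\
/usr/hdp/2.3.4.0-3485/hive/lib/zookeeper-3.4.6.2.3.4.0-3485.jar,\
/usr/hdp/2.3.4.0-3485/hive/lib/guava-14.0.1.jar,\
/usr/hdp/2.3.4.0-3485/hive/lib/protobuf-java-2.5.0.jar,\
/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-common.jar,\
/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-client.jar,\
/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-protocol.jar,\
/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-server.jar

--jars ships the jars to the executors as well; if you are setting the
classpath through spark.driver.extraClassPath / spark.executor.extraClassPath
instead, the same HBase jars need to be added there.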

On 02/29/2016 03:18 PM, Divya Gehlot wrote:

Hi,
I am trying to access a Hive table that was created using HBaseIntegration.

I am able to access the data in the Hive CLI, but when I try to access the
table through Spark's HiveContext I get the following error:

java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/util/Bytes
        at org.apache.hadoop.hive.hbase.HBaseSerDe.parseColumnsMapping(HBaseSerDe.java:184)
        at org.apache.hadoop.hive.hbase.HBaseSerDeParameters.<init>(HBaseSerDeParameters.java:73)
        at org.apache.hadoop.hive.hbase.HBaseSerDe.initialize(HBaseSerDe.java:117)
        at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:53)
        at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:521)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:391)
        at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:276)
        at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:258)
        at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:605)
        at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:330)
        at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:325)



I have added the following jars to the Spark classpath:
/usr/hdp/2.3.4.0-3485/hive/lib/hive-hbase-handler.jar,
/usr/hdp/2.3.4.0-3485/hive/lib/zookeeper-3.4.6.2.3.4.0-3485.jar,
/usr/hdp/2.3.4.0-3485/hive/lib/guava-14.0.1.jar,
/usr/hdp/2.3.4.0-3485/hive/lib/protobuf-java-2.5.0.jar

Which jar files am I missing?


Thanks,
Regards,
Divya
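
On the "which jar files am I missing" question: the quickest check is to search
the HBase lib directory for the class named in the NoClassDefFoundError. A small
sketch, assuming the same HDP layout as above (adjust the directory if your
HBase client libs live elsewhere):

for j in /usr/hdp/2.3.4.0-3485/hbase/lib/*.jar; do
  # print every jar that bundles the class Spark could not load
  unzip -l "$j" 2>/dev/null | grep -q 'org/apache/hadoop/hbase/util/Bytes.class' && echo "$j"
done

Whichever jar that prints (hbase-common for the HBase 1.x that ships with HDP
2.3.x) is the minimum you need; in practice hbase-client, hbase-protocol and
hbase-server typically end up on the classpath as well once the handler actually
talks to HBase.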



