[ 
https://issues.apache.org/jira/browse/ATLAS-2288?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Nixon Rodrigues updated ATLAS-2288:
-----------------------------------
    Attachment: ATLAS-2284-branch-0.8.patch
                ATLAS-2284-master.patch

The fix for this issue was to include the hbase-common jar in the atlas-hive-plugin-impl
directory while packaging.

> NoClassDefFoundError Exception while running import-hive script when hbase 
> table is created via Hive
> ----------------------------------------------------------------------------------------------------
>
>                 Key: ATLAS-2288
>                 URL: https://issues.apache.org/jira/browse/ATLAS-2288
>             Project: Atlas
>          Issue Type: Bug
>            Reporter: Nixon Rodrigues
>         Attachments: ATLAS-2284-branch-0.8.patch, ATLAS-2284-master.patch
>
>
> Import-hive fails with NoClassDefFoundError when it tries to import an HBase table
> created via Hive.
> {code}
> CREATE TABLE `hbase_table_1`(
>   `key` int COMMENT 'from deserializer',
>   `value` string COMMENT 'from deserializer')
> ROW FORMAT SERDE
>   'org.apache.hadoop.hive.hbase.HBaseSerDe'
> STORED BY
>   'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
> WITH SERDEPROPERTIES (
>   'hbase.columns.mapping'=':key,cf1:val',
>   'serialization.format'='1')
> TBLPROPERTIES (
>   'hbase.table.name'='def',
>   'transient_lastDdlTime'='1483707502');
> {code}
> {noformat}
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/util/Bytes
>     at org.apache.hadoop.hive.hbase.HBaseSerDe.parseColumnsMapping(HBaseSerDe.java:184)
>     at org.apache.hadoop.hive.hbase.HBaseSerDeParameters.<init>(HBaseSerDeParameters.java:73)
>     at org.apache.hadoop.hive.hbase.HBaseSerDe.initialize(HBaseSerDe.java:117)
>     at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:54)
>     at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:521)
>     at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:410)
>     at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:397)
>     at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:278)
>     at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:260)
>     at org.apache.hadoop.hive.ql.metadata.Table.getColsInternal(Table.java:622)
>     at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:605)
>     at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.createOrUpdateTableInstance(HiveMetaStoreBridge.java:488)
>     at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.createTableInstance(HiveMetaStoreBridge.java:424)
>     at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.registerTable(HiveMetaStoreBridge.java:505)
>     at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.importTable(HiveMetaStoreBridge.java:289)
>     at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.importTables(HiveMetaStoreBridge.java:272)
>     at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.importDatabases(HiveMetaStoreBridge.java:143)
> {noformat}
> The fix for this issue is to copy the *hbase-common.jar* file into the Hive hook's
> atlas-hive-plugin-impl/ directory.
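> As a manual workaround along the same lines, the jar can be copied by hand; the paths
> below are only illustrative and depend on the local HBase and Atlas installation layout:
> {code}
> # Illustrative paths; adjust HBASE_HOME / ATLAS_HOME to the local installation.
> cp ${HBASE_HOME}/lib/hbase-common-*.jar ${ATLAS_HOME}/hook/hive/atlas-hive-plugin-impl/
> {code}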



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
