Hemanth Yamijala created ATLAS-415:
--------------------------------------
Summary: Hive import fails when importing a table that was already imported without StorageDescriptor information
Key: ATLAS-415
URL: https://issues.apache.org/jira/browse/ATLAS-415
Project: Atlas
Issue Type: Bug
Reporter: Hemanth Yamijala
I found this when testing patches that integrate Storm with Atlas, but I guess
this may occur in other scenarios as well.
To reproduce:
* Run a Storm topology with the Atlas hook enabled that has a HiveBolt (requires the patches for ATLAS-181 and friends).
* Run hive-import after the above.
The first step creates a Hive DB and table entity, setting just the required
attributes, roughly as sketched below. Note that the StorageDescriptor is now an
optional attribute as per the Hive DataModel.
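For illustration only, the minimal registration might look like the following sketch. The type and attribute names here ("hive_table", "name", "tableName", "sd") are assumptions for the example, not copied from the actual Storm hook code; the point is that the optional "sd" (StorageDescriptor) attribute is never set.
{code}
import org.apache.atlas.typesystem.Referenceable;

// Illustrative sketch: a table entity registered with just the required
// attributes. The optional "sd" (StorageDescriptor) attribute is omitted.
public class MinimalTableEntitySketch {
    public static Referenceable buildTableEntity() {
        Referenceable tableRef = new Referenceable("hive_table");
        tableRef.set("name", "default.storm_output_table"); // hypothetical qualified name
        tableRef.set("tableName", "storm_output_table");     // hypothetical table name
        // ... other required attributes ...
        // note: no tableRef.set("sd", ...) call here
        return tableRef;
    }
}
{code}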
The second step fails with this exception:
{code}
Exception in thread "main" java.lang.NullPointerException
    at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.getSDForTable(HiveMetaStoreBridge.java:345)
    at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.importTables(HiveMetaStoreBridge.java:219)
    at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.importDatabases(HiveMetaStoreBridge.java:104)
    at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.importHiveMetadata(HiveMetaStoreBridge.java:96)
    at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.main(HiveMetaStoreBridge.java:503)
{code}
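One possible direction for a fix is a defensive lookup before the storage descriptor is dereferenced. The sketch below is only illustrative and is not the actual HiveMetaStoreBridge code; the attribute name "sd" and the return-null behaviour are assumptions for the example.
{code}
import org.apache.atlas.typesystem.Referenceable;

// Illustrative sketch of a null-safe lookup: return null instead of
// dereferencing a missing StorageDescriptor so the re-import can create
// or update it rather than fail with an NPE.
public class SdLookupSketch {
    public static Referenceable getSDForTable(Referenceable tableRef) throws Exception {
        Object sd = tableRef.get("sd"); // "sd" is optional in the Hive model (assumed attribute name)
        if (sd == null) {
            // Table was registered without StorageDescriptor information
            // (e.g. by the Storm HiveBolt hook); the caller should create
            // the SD here instead of assuming it exists.
            return null;
        }
        return (Referenceable) sd;
    }
}
{code}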