[ 
https://issues.apache.org/jira/browse/ATLAS-537?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15177348#comment-15177348
 ] 

Shwetha G S commented on ATLAS-537:
-----------------------------------

I didn't see the classpath issue in the environment. However, 
{noformat}
java.lang.NullPointerException
        at java.util.ArrayList.addAll(ArrayList.java:577)
        at 
org.apache.atlas.falcon.hook.FalconHook.createProcessInstance(FalconHook.java:238)
        at 
org.apache.atlas.falcon.hook.FalconHook.createEntities(FalconHook.java:180)
        at 
org.apache.atlas.falcon.hook.FalconHook.fireAndForget(FalconHook.java:174)
        at 
org.apache.atlas.falcon.hook.FalconHook.access$200(FalconHook.java:68)
        at org.apache.atlas.falcon.hook.FalconHook$2.run(FalconHook.java:157)
        at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
{noformat}
is an issue and happens when the input/output is an HDFS-based feed.
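
The NPE above comes from `ArrayList.addAll`, which throws when handed a null collection. A minimal sketch of that failure mode and a defensive guard (note: `resolveFeedEntities` is a hypothetical stand-in for whatever lookup inside `FalconHook.createProcessInstance` returns null for HDFS-based feeds; it is not the actual Atlas API):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class AddAllNpeDemo {

    // Hypothetical stand-in: returns null when the feed is HDFS-based,
    // mimicking the suspected behavior behind FalconHook.java:238.
    static List<String> resolveFeedEntities(boolean hdfsBasedFeed) {
        return hdfsBasedFeed ? null : Arrays.asList("feed-entity");
    }

    public static void main(String[] args) {
        List<String> entities = new ArrayList<>();

        try {
            // ArrayList.addAll(null) throws NullPointerException,
            // matching the trace at ArrayList.addAll(ArrayList.java:577).
            entities.addAll(resolveFeedEntities(true));
        } catch (NullPointerException e) {
            System.out.println("NPE from addAll(null)");
        }

        // Defensive pattern: check for null before addAll.
        List<String> resolved = resolveFeedEntities(true);
        if (resolved != null) {
            entities.addAll(resolved);
        }
        System.out.println("entities: " + entities.size()); // prints "entities: 0"
    }
}
```

This only illustrates why the trace points at `addAll`; the real fix likely belongs in the hook's input/output resolution for HDFS feeds rather than a null guard at the call site.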

> Falcon hook failing when tried to submit a process which creates a hive table.
> ------------------------------------------------------------------------------
>
>                 Key: ATLAS-537
>                 URL: https://issues.apache.org/jira/browse/ATLAS-537
>             Project: Atlas
>          Issue Type: Bug
>    Affects Versions: trunk
>            Reporter: Ayub Khan
>            Assignee: Shwetha G S
>            Priority: Blocker
>         Attachments: logs.tar.gz
>
>
> Falcon hook failing when tried to submit a hive process.
> Stack trace from log:
> {noformat}
> 2016-02-25 11:40:38,894 INFO  - [479730212@qtp-989447607-2 - 
> 9f45adf1-2264-420a-8a4d-9d7a729f34b8:hrt_qa:POST//entities/submit/process] ~ 
> PROCESS/Aae61bc53-c2ee37c0123 is published into config store (AUDIT:229)
> 2016-02-25 11:40:38,894 INFO  - [479730212@qtp-989447607-2 - 
> 9f45adf1-2264-420a-8a4d-9d7a729f34b8:hrt_qa:POST//entities/submit/process] ~ 
> Submit successful: (process): Aae61bc53-c2ee37c0123 
> (AbstractEntityManager:417)
> 2016-02-25 11:40:38,895 INFO  - [479730212@qtp-989447607-2 - 
> 9f45adf1-2264-420a-8a4d-9d7a729f34b8:hrt_qa:POST//entities/submit/process] ~ 
> {Action:submit, Dimensions:{colo=NULL, entityType=process}, Status: 
> SUCCEEDED, Time-taken:350536678 ns} (METRIC:38)
> 2016-02-25 11:40:38,896 DEBUG - [479730212@qtp-989447607-2 - 
> 9f45adf1-2264-420a-8a4d-9d7a729f34b8:] ~ Audit: hrt_qa/172.22.101.126 
> performed request 
> http://apathan-atlas-erie-tp-testing-3.novalocal:15000/api/entities/submit/process
>  (172.22.101.123) at time 2016-02-25T11:40Z (FalconAuditFilter:86)
> 2016-02-25 11:40:38,896 INFO  - [Atlas Logger 0:] ~ Entered Atlas hook for 
> Falcon hook operation ADD_PROCESS (FalconHook:167)
> 2016-02-25 11:40:39,625 INFO  - [Atlas Logger 0:] ~ 0: Opening raw store with 
> implemenation class:org.apache.hadoop.hive.metastore.ObjectStore 
> (HiveMetaStore:590)
> 2016-02-25 11:40:39,650 INFO  - [Atlas Logger 0:] ~ ObjectStore, initialize 
> called (ObjectStore:294)
> 2016-02-25 11:40:39,965 INFO  - [Atlas Logger 0:] ~ Property 
> hive.metastore.integral.jdo.pushdown unknown - will be ignored 
> (Persistence:77)
> 2016-02-25 11:40:39,966 INFO  - [Atlas Logger 0:] ~ Property 
> datanucleus.cache.level2 unknown - will be ignored (Persistence:77)
> 2016-02-25 11:40:41,822 WARN  - [Atlas Logger 0:] ~ Retrying creating default 
> database after error: Unexpected exception caught. (HiveMetaStore:623)
> javax.jdo.JDOFatalInternalException: Unexpected exception caught.
>       at 
> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1193)
>       at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
>       at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
>       at 
> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:374)
>       at 
> org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:403)
>       at 
> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:296)
>       at 
> org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:263)
>       at 
> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
>       at 
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
>       at 
> org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
>       at 
> org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:594)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:572)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:621)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:462)
>       at 
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
>       at 
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5789)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
>       at 
> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>       at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>       at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>       at 
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1551)
>       at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
>       at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
>       at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
>       at 
> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3000)
>       at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3019)
>       at 
> org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1237)
>       at 
> org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
>       at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
>       at 
> org.apache.atlas.hive.bridge.HiveMetaStoreBridge.<init>(HiveMetaStoreBridge.java:85)
>       at 
> org.apache.atlas.falcon.hook.FalconHook.registerFalconDataModel(FalconHook.java:334)
>       at 
> org.apache.atlas.falcon.hook.FalconHook.fireAndForget(FalconHook.java:170)
>       at 
> org.apache.atlas.falcon.hook.FalconHook.access$200(FalconHook.java:68)
>       at org.apache.atlas.falcon.hook.FalconHook$2.run(FalconHook.java:157)
>       at 
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>       at java.lang.Thread.run(Thread.java:745)
> NestedThrowablesStackTrace:
> java.lang.reflect.InvocationTargetException
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:497)
>       at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
>       at 
> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
>       at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
>       at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
>       at 
> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:374)
>       at 
> org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:403)
>       at 
> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:296)
>       at 
> org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:263)
>       at 
> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
>       at 
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
>       at 
> org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
>       at 
> org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:594)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:572)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:621)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:462)
>       at 
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
>       at 
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5789)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
>       at 
> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>       at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>       at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>       at 
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1551)
>       at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
>       at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
>       at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
>       at 
> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3000)
>       at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3019)
>       at 
> org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1237)
>       at 
> org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
>       at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
>       at 
> org.apache.atlas.hive.bridge.HiveMetaStoreBridge.<init>(HiveMetaStoreBridge.java:85)
>       at 
> org.apache.atlas.falcon.hook.FalconHook.registerFalconDataModel(FalconHook.java:334)
>       at 
> org.apache.atlas.falcon.hook.FalconHook.fireAndForget(FalconHook.java:170)
>       at 
> org.apache.atlas.falcon.hook.FalconHook.access$200(FalconHook.java:68)
>       at org.apache.atlas.falcon.hook.FalconHook$2.run(FalconHook.java:157)
>       at 
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>       at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.AssertionError: java.lang.NoSuchMethodException: 
> com.google.common.base.internal.Finalizer.startFinalizer(java.lang.Class, 
> java.lang.ref.ReferenceQueue, java.lang.ref.PhantomReference)
>       at 
> com.google.common.base.FinalizableReferenceQueue.getStartFinalizer(FinalizableReferenceQueue.java:308)
>       at 
> com.google.common.base.FinalizableReferenceQueue.<clinit>(FinalizableReferenceQueue.java:90)
>       at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:427)
>       at 
> com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
>       at 
> org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
>       at 
> org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>       at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>       at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>       at 
> org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
>       at 
> org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
>       at 
> org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
>       at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
>       at 
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
>       at 
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
>       at 
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
>       ... 50 more
> Caused by: java.lang.NoSuchMethodException: 
> com.google.common.base.internal.Finalizer.startFinalizer(java.lang.Class, 
> java.lang.ref.ReferenceQueue, java.lang.ref.PhantomReference)
>       at java.lang.Class.getMethod(Class.java:1786)
>       at 
> com.google.common.base.FinalizableReferenceQueue.getStartFinalizer(FinalizableReferenceQueue.java:302)
>       ... 66 more
> 2016-02-25 11:40:41,831 INFO  - [Atlas Logger 0:] ~ 0: Opening raw store with 
> implemenation class:org.apache.hadoop.hive.metastore.ObjectStore 
> (HiveMetaStore:590)
> 2016-02-25 11:40:41,835 INFO  - [Atlas Logger 0:] ~ ObjectStore, initialize 
> called (ObjectStore:294)
> 2016-02-25 11:40:41,885 INFO  - [Atlas Logger 0:] ~ Property 
> hive.metastore.integral.jdo.pushdown unknown - will be ignored 
> (Persistence:77)
> 2016-02-25 11:40:41,885 INFO  - [Atlas Logger 0:] ~ Property 
> datanucleus.cache.level2 unknown - will be ignored (Persistence:77)
> 2016-02-25 11:40:41,959 WARN  - [Atlas Logger 0:] ~ Failed to access 
> metastore. This class should not accessed in runtime. (Hive:168)
> org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: 
> Unable to instantiate 
> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
>       at 
> org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1239)
>       at 
> org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
>       at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
>       at 
> org.apache.atlas.hive.bridge.HiveMetaStoreBridge.<init>(HiveMetaStoreBridge.java:85)
>       at 
> org.apache.atlas.falcon.hook.FalconHook.registerFalconDataModel(FalconHook.java:334)
>       at 
> org.apache.atlas.falcon.hook.FalconHook.fireAndForget(FalconHook.java:170)
>       at 
> org.apache.atlas.falcon.hook.FalconHook.access$200(FalconHook.java:68)
>       at org.apache.atlas.falcon.hook.FalconHook$2.run(FalconHook.java:157)
>       at 
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>       at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.RuntimeException: Unable to instantiate 
> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
>       at 
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1553)
>       at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
>       at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
>       at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
>       at 
> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3000)
>       at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3019)
>       at 
> org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1237)
>       ... 12 more
> Caused by: java.lang.reflect.InvocationTargetException
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>       at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>       at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>       at 
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1551)
>       ... 18 more
> Caused by: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
>       at 
> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1193)
>       at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
>       at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
>       at 
> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:374)
>       at 
> org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:403)
>       at 
> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:296)
>       at 
> org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:263)
>       at 
> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
>       at 
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
>       at 
> org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
>       at 
> org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:594)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:572)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:625)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:462)
>       at 
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
>       at 
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5789)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
>       at 
> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
>       ... 23 more
> Caused by: java.lang.reflect.InvocationTargetException
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:497)
>       at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
>       at 
> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
>       ... 42 more
> Caused by: java.lang.NoClassDefFoundError: Could not initialize class 
> com.google.common.base.FinalizableReferenceQueue
>       at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:427)
>       at 
> com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
>       at 
> org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
>       at 
> org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>       at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>       at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>       at 
> org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
>       at 
> org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
>       at 
> org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
>       at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
>       at 
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
>       at 
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
>       at 
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
>       ... 50 more
> 2016-02-25 11:40:42,092 INFO  - [Atlas Logger 0:] ~ Real User: falcon 
> (auth:SIMPLE), is from ticket cache? false (SecureClientUtils:90)
> 2016-02-25 11:40:42,092 INFO  - [Atlas Logger 0:] ~ doAsUser: falcon 
> (SecureClientUtils:93)
> 2016-02-25 11:40:42,464 INFO  - [Atlas Logger 0:] ~ Hive data model is 
> already registered! (HiveMetaStoreBridge:489)
> 2016-02-25 11:40:42,470 INFO  - [Atlas Logger 0:] ~ Registering Falcon data 
> model (FalconHook:339)
> 2016-02-25 11:40:42,470 INFO  - [Atlas Logger 0:] ~ Generating the Falcon 
> Data Model (FalconDataModelGenerator:74)
> 2016-02-25 11:40:43,739 INFO  - [Atlas Logger 0:] ~ Creating process Instance 
> : Aae61bc53-c2ee37c0123 (FalconHook:223)
> 2016-02-25 11:40:43,740 INFO  - [Atlas Logger 0:] ~ Atlas hook failed 
> (FalconHook:159)
> java.lang.NullPointerException
>       at java.util.ArrayList.addAll(ArrayList.java:577)
>       at 
> org.apache.atlas.falcon.hook.FalconHook.createProcessInstance(FalconHook.java:238)
>       at 
> org.apache.atlas.falcon.hook.FalconHook.createEntities(FalconHook.java:180)
>       at 
> org.apache.atlas.falcon.hook.FalconHook.fireAndForget(FalconHook.java:174)
>       at 
> org.apache.atlas.falcon.hook.FalconHook.access$200(FalconHook.java:68)
>       at org.apache.atlas.falcon.hook.FalconHook$2.run(FalconHook.java:157)
>       at 
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>       at java.lang.Thread.run(Thread.java:745)
> 2016-02-25 11:41:47,898 INFO  - [Thread-11:] ~ config.location is set, using: 
> /usr/hdp/current/falcon-server/conf/runtime.properties 
> (ApplicationProperties:108)
> 2016-02-25 11:41:47,901 INFO  - [Thread-11:] ~ Initializing 
> org.apache.falcon.util.RuntimeProperties properties with domain falcon 
> (ApplicationProperties:145)
> 2016-02-25 11:41:47,902 DEBUG - [Thread-11:] ~ 
> log.cleanup.frequency.days.retention=days(7) (ApplicationProperties:151)
> 2016-02-25 11:41:47,902 DEBUG - [Thread-11:] ~ 
> log.cleanup.frequency.minutes.retention=hours(6) (ApplicationProperties:151)
> 2016-02-25 11:41:47,902 DEBUG - [Thread-11:] ~ domain=falcon 
> (ApplicationProperties:151)
> 2016-02-25 11:41:47,903 DEBUG - [Thread-11:] ~ feed.late.frequency=minutes(3) 
> (ApplicationProperties:151)
> 2016-02-25 11:41:47,903 DEBUG - [Thread-11:] ~ 
> workflow.status.retry.count=150 (ApplicationProperties:151)
> 2016-02-25 11:41:47,903 DEBUG - [Thread-11:] ~ 
> log.cleanup.frequency.months.retention=months(3) (ApplicationProperties:151)
> 2016-02-25 11:41:47,904 DEBUG - [Thread-11:] ~ 
> log.cleanup.frequency.hours.retention=minutes(1) (ApplicationProperties:151)
> 2016-02-25 11:41:47,904 INFO  - [Thread-11:] ~ config.location is set, using: 
> /usr/hdp/current/falcon-server/conf/runtime.properties 
> (ApplicationProperties:108)
> 2016-02-25 11:41:47,905 INFO  - [Thread-11:] ~ Initializing 
> org.apache.falcon.util.RuntimeProperties properties with domain falcon 
> (ApplicationProperties:145)
> 2016-02-25 11:41:47,905 DEBUG - [Thread-11:] ~ 
> log.cleanup.frequency.days.retention=days(7) (ApplicationProperties:151)
> 2016-02-25 11:41:47,905 DEBUG - [Thread-11:] ~ 
> log.cleanup.frequency.minutes.retention=hours(6) (ApplicationProperties:151)
> 2016-02-25 11:41:47,906 DEBUG - [Thread-11:] ~ domain=falcon 
> (ApplicationProperties:151)
> 2016-02-25 11:41:47,906 DEBUG - [Thread-11:] ~ feed.late.frequency=minutes(3) 
> (ApplicationProperties:151)
> 2016-02-25 11:41:47,906 DEBUG - [Thread-11:] ~ 
> workflow.status.retry.count=150 (ApplicationProperties:151)
> 2016-02-25 11:41:47,906 DEBUG - [Thread-11:] ~ 
> log.cleanup.frequency.months.retention=months(3) (ApplicationProperties:151)
> 2016-02-25 11:41:47,907 DEBUG - [Thread-11:] ~ 
> log.cleanup.frequency.hours.retention=minutes(1) (ApplicationProperties:151)
> 2016-02-25 11:43:22,262 INFO  - [1531301613@qtp-989447607-6 - 
> bcfe7646-e25d-418a-a7ed-5c330aebde29:] ~ HttpServletRequest RemoteUser is 
> falcon (Servlets:47)
> 2016-02-25 11:43:22,265 INFO  - [1531301613@qtp-989447607-6 - 
> bcfe7646-e25d-418a-a7ed-5c330aebde29:falcon:GET//admin/clearuser] ~ Logging 
> in falcon (CurrentUser:65)
> 2016-02-25 11:43:22,265 INFO  - [1531301613@qtp-989447607-6 - 
> bcfe7646-e25d-418a-a7ed-5c330aebde29:falcon:GET//admin/clearuser] ~ Request 
> from authenticated user: falcon, URL=/api/admin/clearuser?user.name=falcon, 
> doAs user: null (FalconAuthenticationFilter:185)
> 2016-02-25 11:43:22,266 INFO  - [1531301613@qtp-989447607-6 - 
> bcfe7646-e25d-418a-a7ed-5c330aebde29:falcon:GET//admin/clearuser] ~ 
> Authorizing user=falcon against request=RequestParts{resource='admin', 
> action='clearuser'} (FalconAuthorizationFilter:78)
> 2016-02-25 11:43:22,267 INFO  - [1531301613@qtp-989447607-6 - 
> bcfe7646-e25d-418a-a7ed-5c330aebde29:falcon:GET//admin/clearuser] ~ 
> Authorization succeeded for user=falcon, proxy=falcon 
> (FalconAuthorizationFilter:88)
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
