Hi Guys, Any update on this?
Caused by: com.sleepycat.je.LockTimeoutException: (JE 5.0.73) Lock expired. Locker 768997492 3544_NotificationHookConsumer thread-0_Txn: waited for lock on database=edgestore LockAddr:1860143233 LSN=0x0/0xc8613 type=WRITE grant=WAIT_PROMOTION timeoutMillis=500 startTime=1502275039421 endTime=1502275039921
Owners: [<LockInfo locker="768997492 3544_NotificationHookConsumer thread-0_Txn" type="READ"/>, <LockInfo locker="1176127601 3312_pool-1-thread-7_Txn" type="READ"/>]
Waiters: []
Transaction 768997492 3544_NotificationHookConsumer thread-0_Txn owns LockAddr:1860143233 <LockInfo locker="768997492 3544_NotificationHookConsumer thread-0_Txn" type="READ"/>
Transaction 768997492 3544_NotificationHookConsumer thread-0_Txn waits for LockAddr:1860143233
at com.sleepycat.je.txn.LockManager.newLockTimeoutException(LockManager.java:664)
at com.sleepycat.je.txn.LockManager.makeTimeoutMsgInternal(LockManager.java:623)
at com.sleepycat.je.txn.SyncedLockManager.makeTimeoutMsg(SyncedLockManager.java:97)

I am not able to generate lineage due to this issue.

Thanks!

On Fri, Aug 4, 2017 at 5:05 PM, Vineet Mishra <clearmido...@gmail.com> wrote:
> Hi Team,
>
> I am using the Atlas 0.8 release branch with Hive 2.1. It works fine with
> hive import and with a simple CTAS on Hive, but when I run a CTAS with
> joins, it starts failing to create lineage (the metadata is still captured
> by Atlas).
>
> Also, when I run the same workload with Hive 1.1.0, it doesn't even capture
> the metadata in the latter case; the logs suggest a persistence issue.
> Has anyone been through this situation?
>
> Atlas log with Hive 2.1
> -------------------------------
> 2017-08-03 05:31:24,011 ERROR - [NotificationHookConsumer thread-0:] ~
> Could not commit transaction [71] due to storage exception in commit
> (StandardTitanGraph:673)
> com.thinkaurelius.titan.core.TitanException: Could not execute operation
> due to backend exception
> at com.thinkaurelius.titan.diskstorage.util.BackendOperation.execute(BackendOperation.java:44)
> at com.thinkaurelius.titan.diskstorage.keycolumnvalue.cache.CacheTransaction.persist(CacheTransaction.java:86)
> at com.thinkaurelius.titan.diskstorage.keycolumnvalue.cache.CacheTransaction.flushInternal(CacheTransaction.java:140)
> .
> .
> .
> .
> Caused by: com.thinkaurelius.titan.diskstorage.PermanentBackendException:
> Permanent failure in storage backend
> at com.thinkaurelius.titan.diskstorage.berkeleyje.BerkeleyJEKeyValueStore.insert(BerkeleyJEKeyValueStore.java:206)
> .
> .
> .
> .
> Caused by: com.sleepycat.je.LockTimeoutException: (JE 5.0.73) Lock
> expired. Locker 2046519996 19950_NotificationHookConsumer thread-0_Txn:
> waited for lock on database=edgestore LockAddr:2100569783 LSN=0x0/0xeb475
> type=WRITE grant=WAIT_PROMOTION timeoutMillis=500 startTime=1501738283510
> endTime=1501738284010
> Owners: [<LockInfo locker="1619574649 19922_pool-1-thread-4_Txn"
> type="READ"/>, <LockInfo locker="846918696 19916_pool-1-thread-7_Txn"
> type="READ"/>, <LockInfo locker="2046519996 19950_NotificationHookConsumer
> thread-0_Txn" type="READ"/>]
> Waiters: []
> Transaction 2046519996 19950_NotificationHookConsumer thread-0_Txn owns
> LockAddr:2100569783 <LockInfo locker="2046519996
> 19950_NotificationHookConsumer thread-0_Txn" type="READ"/>
> Transaction 2046519996 19950_NotificationHookConsumer thread-0_Txn waits
> for LockAddr:2100569783
>
>
> Atlas log with Hive 1.1
> ------------------------------
> 2017-08-04 11:19:15,533 ERROR - [Thread-10:] ~ Evicted [2@7f00010114224-or1010051029126-corp-adobe-com1]
> from cache but waiting too long for transactions to close.
> Stale transaction alert on: [standardtitantx[null]]
> (ManagementLogger$SendAckOnTxClose:189)
> 2017-08-04 11:19:15,537 ERROR - [Thread-11:] ~ Evicted [3@7f00010114224-or1010051029126-corp-adobe-com1]
> from cache but waiting too long for transactions to close.
> Stale transaction alert on: [standardtitantx[null]]
> (ManagementLogger$SendAckOnTxClose:189)
>
>
> Any help would be highly appreciated.
>
> Thanks!
>
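For anyone hitting the same trace: timeoutMillis=500 in both logs is BerkeleyDB JE's default lock timeout. One possible workaround (a sketch only, assuming the graph is on the berkeleyje backend and that you can place a je.properties file in the JE environment home, i.e. the directory Atlas uses as its graph storage directory) is to raise that timeout so short read/write promotion conflicts have more time to resolve:

```properties
# je.properties, placed in the BerkeleyJE environment directory
# (the Atlas graph storage directory). JE 5 accepts duration strings
# with units; the property corresponds to EnvironmentConfig.LOCK_TIMEOUT.
je.lock.timeout=2 s
```

Note that a longer timeout only papers over the contention between the NotificationHookConsumer thread and the pool threads; if both transactions hold READ locks and each waits for a WRITE promotion on the same record, one of them still has to abort and retry.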
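Since the conflicting transaction is aborted by JE rather than resolved, the usual client-side mitigation is to retry the failed commit with backoff. The sketch below is a generic illustration of that pattern in plain Java; the helper name and the simulated conflict are hypothetical and are not an Atlas or Titan API (in a real deployment you would wrap the graph transaction and test the cause chain for com.sleepycat.je.LockTimeoutException):

```java
import java.util.concurrent.Callable;

// Hypothetical retry helper for transient lock conflicts. Illustrative
// only: not part of the Atlas/Titan codebase.
public class CommitRetry {

    // Run `work` up to maxAttempts times, sleeping with a linearly
    // growing backoff between attempts, and rethrow the last failure
    // if every attempt fails.
    static <T> T withRetry(Callable<T> work, int maxAttempts) throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return work.call();
            } catch (Exception e) {
                // Real code would only retry when the cause chain
                // contains a lock-timeout (transient) exception.
                last = e;
                Thread.sleep(50L * attempt);
            }
        }
        throw last;
    }

    public static void main(String[] args) throws Exception {
        final int[] calls = {0};
        // Simulated commit that conflicts twice, then succeeds.
        String result = withRetry(() -> {
            if (++calls[0] < 3) throw new IllegalStateException("lock conflict");
            return "committed";
        }, 5);
        System.out.println(result + " after " + calls[0] + " attempts");
    }
}
```

In Atlas itself the failing commit happens inside the NotificationHookConsumer, so the retry would have to wrap the graph transaction (abort, reopen, reapply the mutations) rather than a bare Callable.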