Here are some details from the meta store logs:

2018-06-08T03:34:20,634 ERROR [pool-13-thread-197([])]: metastore.RetryingHMSHandler (RetryingHMSHandler.java:invokeInternal(204)) - java.lang.IllegalStateException: Unexpected DataOperationType: UNSET agentInfo=Unknown txnid:130551
    at org.apache.hadoop.hive.metastore.txn.TxnHandler.enqueueLockWithRetry(TxnHandler.java:1000)
    at org.apache.hadoop.hive.metastore.txn.TxnHandler.lock(TxnHandler.java:872)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.lock(HiveMetaStore.java:6366)
    at sun.reflect.GeneratedMethodAccessor11.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
    at com.sun.proxy.$Proxy32.lock(Unknown Source)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$lock.getResult(ThriftHiveMetastore.java:14155)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$lock.getResult(ThriftHiveMetastore.java:14139)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
    at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
    at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)



Here are some details about the environment:

Source:

Storm topology version: 1.1.1
storm-hive version: 1.1.1
The mvn dependency plugin shows the following dependencies:
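(In case anyone wants to reproduce this listing, a command along these lines should work; the exact goal and grep pattern are just a sketch, adjust for your project layout:)

```
# List resolved artifacts on the topology's classpath,
# then keep only the Hive and Hadoop entries.
mvn dependency:list | grep -E 'org\.apache\.(hive|hadoop)'
```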

Hive:

[INFO]    org.apache.hive.shims:hive-shims-0.23:jar:0.14.0:runtime
[INFO]    org.apache.hive:hive-ant:jar:0.14.0:compile
[INFO]    org.apache.hive:hive-metastore:jar:0.14.0:compile
[INFO]    org.apache.hive:hive-shims:jar:0.14.0:compile
[INFO]    org.apache.hive:hive-cli:jar:0.14.0:compile
[INFO]    org.apache.hive:hive-exec:jar:0.14.0:compile
[INFO]    org.apache.hive.shims:hive-shims-common-secure:jar:0.14.0:compile
[INFO]    org.apache.hive.shims:hive-shims-common:jar:0.14.0:compile
[INFO]    org.apache.hive:hive-common:jar:0.14.0:compile
[INFO]    org.apache.hive.shims:hive-shims-0.20S:jar:0.14.0:runtime
[INFO]    org.apache.hive.shims:hive-shims-0.20:jar:0.14.0:runtime
[INFO]    org.apache.hive.hcatalog:hive-hcatalog-streaming:jar:0.14.0:compile
[INFO]    org.apache.hive:hive-serde:jar:0.14.0:compile
[INFO]    org.apache.storm:storm-hive:jar:1.1.1:compile
[INFO]    org.apache.hive:hive-service:jar:0.14.0:compile
[INFO]    org.apache.hive.hcatalog:hive-hcatalog-core:jar:0.14.0:compile


Hadoop:

[INFO]    org.apache.hadoop:hadoop-mapreduce-client-core:jar:2.6.1:compile
[INFO]    org.apache.hadoop:hadoop-yarn-common:jar:2.6.1:compile
[INFO]    org.apache.hadoop:hadoop-common:jar:2.6.1:compile
[INFO]    org.apache.hadoop:hadoop-mapreduce-client-jobclient:jar:2.6.1:compile
[INFO]    org.apache.hadoop:hadoop-yarn-api:jar:2.6.1:compile
[INFO]    org.apache.hadoop:hadoop-client:jar:2.6.1:compile
[INFO]    org.apache.hadoop:hadoop-auth:jar:2.6.1:compile
[INFO]    org.apache.hadoop:hadoop-yarn-client:jar:2.6.1:compile
[INFO]    org.apache.hadoop:hadoop-mapreduce-client-app:jar:2.6.1:compile
[INFO]    org.apache.hadoop:hadoop-mapreduce-client-common:jar:2.6.1:compile
[INFO]    org.apache.hadoop:hadoop-annotations:jar:2.6.1:compile
[INFO]    org.apache.hadoop:hadoop-hdfs:jar:2.6.1:compile
[INFO]    org.apache.hadoop:hadoop-mapreduce-client-shuffle:jar:2.6.1:compile
[INFO]    org.apache.hadoop:hadoop-yarn-server-common:jar:2.6.1:compile

and HDFS:

[INFO]    org.apache.hadoop:hadoop-hdfs:jar:2.6.1:compile

Sink:
Hive on EMR

Hadoop version:

[hadoop@ip-10-0-6-16 ~]$ hadoop version
Hadoop 2.8.3-amzn-0

Hive version:

[hadoop@ip-10-0-6-16 ~]$ hive --version
Hive 2.3.2-amzn-2


Is there any inconsistency here that could lead to such an error?
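(One inconsistency that stands out from the listings above: the topology bundles Hive 0.14.0 client jars while the sink cluster runs Hive 2.3.2. A possible mitigation, offered only as an unverified sketch, is to pin the Hive client artifacts pulled in by storm-hive to a version matching the server, e.g. via dependencyManagement in the pom. The artifact set and version below are assumptions and would need to be checked against the cluster:)

```
<!-- Hypothetical pom.xml fragment: force the Hive client jars on the
     topology classpath to match the sink cluster's Hive 2.3.2. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.hive.hcatalog</groupId>
      <artifactId>hive-hcatalog-streaming</artifactId>
      <version>2.3.2</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hive.hcatalog</groupId>
      <artifactId>hive-hcatalog-core</artifactId>
      <version>2.3.2</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-metastore</artifactId>
      <version>2.3.2</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```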


On Thu, Jun 7, 2018 at 7:35 PM, Roshan Naik <roshan_n...@yahoo.com> wrote:

> The lock issue seems to be happening on the Metastore end and surfacing
> via the API.
> Partition creation is working but the API is unable to acquire a TxnBatch
> from the metastore due to the lock issue.
> Check the hive metastore logs and see why the locks are failing.
> Roshan
>
>
>
>
>
> On Thursday, June 7, 2018, 11:08 AM, Milind Vaidya <kava...@gmail.com>
> wrote:
>
> Hi
>
> I am using Storm and storm-hive version 1.1.1 to store data directly into a
> Hive cluster.
>
> After using the mvn shade plugin and overcoming a few other errors, I am now
> stuck at this point.
>
> The strange thing observed was that a few partitions were created but the
> data was not inserted.
>
> dt=17688/platform=site/country=SG/entity_id=abcd
> dt=17688/platform=site/country=SG/entity_id=asdlfa
> dt=17688/platform=site/country=SG/entity_id=asdq13
> dt=17688/platform=site/country=SG/entity_id=123124
>
>
> What are my debugging options here? (Some data from the log has been removed
> intentionally.)
>
>
> 2018-06-07 16:35:22.459 h.metastore Thread-12-users-by-song-hive-bolt-executor[5 5] [INFO] Connected to metastore.
> 2018-06-07 16:35:22.545 o.a.s.h.b.HiveBolt Thread-12-users-by-song-hive-bolt-executor[5 5] [ERROR] Failed to create HiveWriter for endpoint: { }
> org.apache.storm.hive.common.HiveWriter$ConnectFailure: Failed connecting to EndPoint {metaStoreUri='', database='', table='', partitionVals=[] }
>       at org.apache.storm.hive.common.HiveWriter.<init>(HiveWriter.java:80) ~[stormjar.jar:?]
>       at org.apache.storm.hive.common.HiveUtils.makeHiveWriter(HiveUtils.java:50) ~[stormjar.jar:?]
>       at org.apache.storm.hive.bolt.HiveBolt.getOrCreateWriter(HiveBolt.java:262) [stormjar.jar:?]
>       at org.apache.storm.hive.bolt.HiveBolt.execute(HiveBolt.java:112) [stormjar.jar:?]
>       at org.apache.storm.daemon.executor$fn__5030$tuple_action_fn__5032.invoke(executor.clj:729) [storm-core-1.1.1.jar:1.1.1]
>       at org.apache.storm.daemon.executor$mk_task_receiver$fn__4951.invoke(executor.clj:461) [storm-core-1.1.1.jar:1.1.1]
>       at org.apache.storm.disruptor$clojure_handler$reify__4465.onEvent(disruptor.clj:40) [storm-core-1.1.1.jar:1.1.1]
>       at org.apache.storm.utils.DisruptorQueue.consumeBatchToCursor(DisruptorQueue.java:482) [storm-core-1.1.1.jar:1.1.1]
>       at org.apache.storm.utils.DisruptorQueue.consumeBatchWhenAvailable(DisruptorQueue.java:460) [storm-core-1.1.1.jar:1.1.1]
>       at org.apache.storm.disruptor$consume_batch_when_available.invoke(disruptor.clj:73) [storm-core-1.1.1.jar:1.1.1]
>       at org.apache.storm.daemon.executor$fn__5030$fn__5043$fn__5096.invoke(executor.clj:848) [storm-core-1.1.1.jar:1.1.1]
>       at org.apache.storm.util$async_loop$fn__557.invoke(util.clj:484) [storm-core-1.1.1.jar:1.1.1]
>       at clojure.lang.AFn.run(AFn.java:22) [clojure-1.7.0.jar:?]
>       at java.lang.Thread.run(Thread.java:745) [?:1.7.0_131]
> Caused by: org.apache.storm.hive.common.HiveWriter$TxnBatchFailure: Failed acquiring Transaction Batch from EndPoint: {metaStoreUri='', database='', table='', partitionVals=[, , , ] }
>       at org.apache.storm.hive.common.HiveWriter.nextTxnBatch(HiveWriter.java:264) ~[stormjar.jar:?]
>       at org.apache.storm.hive.common.HiveWriter.<init>(HiveWriter.java:72) ~[stormjar.jar:?]
>       ... 13 more
> Caused by: org.apache.hive.hcatalog.streaming.TransactionError: Unable to acquire lock on { }
>       at org.apache.hive.hcatalog.streaming.HiveEndPoint$TransactionBatchImpl.beginNextTransactionImpl(HiveEndPoint.java:575) ~[stormjar.jar:?]
>       at org.apache.hive.hcatalog.streaming.HiveEndPoint$TransactionBatchImpl.beginNextTransaction(HiveEndPoint.java:544) ~[stormjar.jar:?]
>       at org.apache.storm.hive.common.HiveWriter.nextTxnBatch(HiveWriter.java:259) ~[stormjar.jar:?]
>       at org.apache.storm.hive.common.HiveWriter.<init>(HiveWriter.java:72) ~[stormjar.jar:?]
>       ... 13 more
> Caused by: org.apache.thrift.transport.TTransportException
>       at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[stormjar.jar:?]
>       at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84) ~[stormjar.jar:?]
>       at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:378) ~[stormjar.jar:?]
>       at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:297) ~[stormjar.jar:?]
>       at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:204) ~[stormjar.jar:?]
>       at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69) ~[stormjar.jar:?]
>       at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_lock(ThriftHiveMetastore.java:3781) ~[stormjar.jar:?]
>       at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.lock(ThriftHiveMetastore.java:3768) ~[stormjar.jar:?]
>       at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.lock(HiveMetaStoreClient.java:1736) ~[stormjar.jar:?]
>       at org.apache.hive.hcatalog.streaming.HiveEndPoint$TransactionBatchImpl.beginNextTransactionImpl(HiveEndPoint.java:570) ~[stormjar.jar:?]
>       at org.apache.hive.hcatalog.streaming.HiveEndPoint$TransactionBatchImpl.beginNextTransaction(HiveEndPoint.java:544) ~[stormjar.jar:?]
>       at org.apache.storm.hive.common.HiveWriter.nextTxnBatch(HiveWriter.java:259) ~[stormjar.jar:?]
>       at org.apache.storm.hive.common.HiveWriter.<init>(HiveWriter.java:72) ~[stormjar.jar:?]
>       ... 13 more
>
>
>
