Hi All,

 

I am trying to write data to dynamic partitions in Hive using the
PutHiveStreaming processor. It fails when the partition has to be
auto-created, but works fine if the partition already exists.
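
For context, PutHiveStreaming sits on top of the hive-hcatalog-streaming
API, and as far as I can tell the processor's Auto-Create Partitions
property ends up as the createPartIfNotExists flag on
HiveEndPoint.newConnection(). Below is a minimal sketch of that call
sequence, using my endpoint values but an illustrative record and class
name:

import java.util.Collections;
import org.apache.hive.hcatalog.streaming.HiveEndPoint;
import org.apache.hive.hcatalog.streaming.StreamingConnection;
import org.apache.hive.hcatalog.streaming.StrictJsonWriter;
import org.apache.hive.hcatalog.streaming.TransactionBatch;

public class StreamingSketch {
    public static void main(String[] args) throws Exception {
        // Same endpoint as in the errors below; the partition value is
        // taken from the incoming flow file at runtime.
        HiveEndPoint endPoint = new HiveEndPoint(
                "thrift://localhost:9083", "test", "jsondpi3",
                Collections.singletonList("356945013"));

        // true = create the partition in the metastore if it is missing;
        // with false, connecting for a not-yet-existing partition fails,
        // which is what my ConnectFailure looks like.
        StreamingConnection connection = endPoint.newConnection(true);

        StrictJsonWriter writer = new StrictJsonWriter(endPoint);
        TransactionBatch txnBatch = connection.fetchTransactionBatch(10, writer);
        txnBatch.beginNextTransaction();
        txnBatch.write("{\"field\":\"value\"}".getBytes()); // illustrative record
        txnBatch.commit();
        txnBatch.close();
        connection.close();
    }
}

So I would expect the missing partition to be created on the first write
rather than the connection failing.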

 

NiFi throws the following errors when auto partitioning is attempted:

 

12:48:14 IST ERROR 80ee3b56-68a5-13e1-3251-9cc44bebddf4
PutHiveStreaming[id=80ee3b56-68a5-13e1-3251-9cc44bebddf4] Failed to create
HiveWriter for endpoint: {metaStoreUri='thrift://localhost:9083',
database='test', table='jsondpi3', partitionVals=[356945013] }:
org.apache.nifi.util.hive.HiveWriter$ConnectFailure: Failed connecting to
EndPoint {metaStoreUri='thrift://localhost:9083', database='test',
table='jsondpi3', partitionVals=[356945013] }

12:48:14 IST ERROR 80ee3b56-68a5-13e1-3251-9cc44bebddf4
PutHiveStreaming[id=80ee3b56-68a5-13e1-3251-9cc44bebddf4] Error connecting
to Hive endpoint: table jsondpi3 at thrift://localhost:9083

12:48:14 IST ERROR 80ee3b56-68a5-13e1-3251-9cc44bebddf4
PutHiveStreaming[id=80ee3b56-68a5-13e1-3251-9cc44bebddf4] Hive Streaming
connect/write error, flow file will be penalized and routed to retry.
org.apache.nifi.util.hive.HiveWriter$ConnectFailure: Failed connecting to
EndPoint {metaStoreUri='thrift://localhost:9083', database='test',
table='jsondpi3', partitionVals=[356945013] }:
org.apache.nifi.processors.hive.PutHiveStreaming$ShouldRetryException: Hive
Streaming connect/write error, flow file will be penalized and routed to
retry. org.apache.nifi.util.hive.HiveWriter$ConnectFailure: Failed
connecting to EndPoint {metaStoreUri='thrift://localhost:9083',
database='test', table='jsondpi3', partitionVals=[356945013] }

 

The NiFi app log shows the following stack trace:

 

org.apache.nifi.processors.hive.PutHiveStreaming$ShouldRetryException: Hive
Streaming connect/write error, flow file will be penalized and routed to
retry. org.apache.nifi.util.hive.HiveWriter$ConnectFailure: Failed
connecting to EndPoint {metaStoreUri='thrift://localhost:9083',
database='test', table='jsondpi3', partitionVals=[356945013] }
        at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onHiveRecordsError$1(PutHiveStreaming.java:527)
        at org.apache.nifi.processor.util.pattern.ExceptionHandler$OnError.lambda$andThen$0(ExceptionHandler.java:54)
        at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onHiveRecordError$2(PutHiveStreaming.java:545)
        at org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:148)
        at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onTrigger$12(PutHiveStreaming.java:677)
        at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2174)
        at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2144)
        at org.apache.nifi.processors.hive.PutHiveStreaming.onTrigger(PutHiveStreaming.java:631)
        at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onTrigger$4(PutHiveStreaming.java:555)
        at org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114)
        at org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184)
        at org.apache.nifi.processors.hive.PutHiveStreaming.onTrigger(PutHiveStreaming.java:555)
        at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1119)
        at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
        at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
        at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)

 

 

The detailed error log is attached below. The final "Caused by" in it is a
NoSuchObjectException for partition values=[356945013], i.e. the metastore
reports the partition as missing when the record writer asks for its path.

Property details are also attached.
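
In case it matters: since writes succeed when the partition already
exists, the table itself should satisfy the usual Hive Streaming
requirements (bucketed, stored as ORC, 'transactional'='true', with
hive.txn.manager=DbTxnManager and hive.support.concurrency=true on the
Hive side). For comparison, a table of that shape can be created over
JDBC roughly as follows; the schema and HiveServer2 URL are placeholders,
not my actual setup:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreateStreamingTable {
    public static void main(String[] args) throws Exception {
        // Placeholder HiveServer2 URL; adjust host, port and database.
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive2://localhost:10000/test");
             Statement stmt = conn.createStatement()) {
            // Hive Streaming needs a bucketed, ORC-backed, transactional
            // table; partitioned tables without these properties cannot
            // be streamed to.
            stmt.execute(
                "CREATE TABLE jsondpi3 (msg STRING) " +
                "PARTITIONED BY (id STRING) " +       // placeholder partition column
                "CLUSTERED BY (msg) INTO 4 BUCKETS " +
                "STORED AS ORC " +
                "TBLPROPERTIES ('transactional'='true')");
        }
    }
}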

 

Please let me know what I am doing wrong.

 

Regards,

Mohit Jain

---- Attached: detailed error log ----

2017-10-25 12:48:14,216 ERROR [Timer-Driven Process Thread-6]
o.a.n.processors.hive.PutHiveStreaming 
PutHiveStreaming[id=80ee3b56-68a5-13e1-3251-9cc44bebddf4] Error connecting to 
Hive endpoint: table jsondpi3 at thrift://localhost:9083
2017-10-25 12:48:14,216 ERROR [Timer-Driven Process Thread-6] 
o.a.n.processors.hive.PutHiveStreaming 
PutHiveStreaming[id=80ee3b56-68a5-13e1-3251-9cc44bebddf4] Hive Streaming 
connect/write error, flow file will be penalized and routed to retry. 
org.apache.nifi.util.hive.HiveWriter$ConnectFailure: Failed connecting to 
EndPoint {metaStoreUri='thrift://localhost:9083', database='test', 
table='jsondpi3', partitionVals=[356945013] }: 
org.apache.nifi.processors.hive.PutHiveStreaming$ShouldRetryException: Hive 
Streaming connect/write error, flow file will be penalized and routed to retry. 
org.apache.nifi.util.hive.HiveWriter$ConnectFailure: Failed connecting to 
EndPoint {metaStoreUri='thrift://localhost:9083', database='test', 
table='jsondpi3', partitionVals=[356945013] }
org.apache.nifi.processors.hive.PutHiveStreaming$ShouldRetryException: Hive 
Streaming connect/write error, flow file will be penalized and routed to retry. 
org.apache.nifi.util.hive.HiveWriter$ConnectFailure: Failed connecting to 
EndPoint {metaStoreUri='thrift://localhost:9083', database='test', 
table='jsondpi3', partitionVals=[356945013] }
        at 
org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onHiveRecordsError$1(PutHiveStreaming.java:527)
        at 
org.apache.nifi.processor.util.pattern.ExceptionHandler$OnError.lambda$andThen$0(ExceptionHandler.java:54)
        at 
org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onHiveRecordError$2(PutHiveStreaming.java:545)
        at 
org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:148)
        at 
org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onTrigger$12(PutHiveStreaming.java:677)
        at 
org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2174)
        at 
org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2144)
        at 
org.apache.nifi.processors.hive.PutHiveStreaming.onTrigger(PutHiveStreaming.java:631)
        at 
org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onTrigger$4(PutHiveStreaming.java:555)
        at 
org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114)
        at 
org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184)
        at 
org.apache.nifi.processors.hive.PutHiveStreaming.onTrigger(PutHiveStreaming.java:555)
        at 
org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1119)
        at 
org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
        at 
org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
        at 
org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
        at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
        at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
        at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.nifi.util.hive.HiveWriter$ConnectFailure: Failed 
connecting to EndPoint {metaStoreUri='thrift://localhost:9083', 
database='test', table='jsondpi3', partitionVals=[356945013] }
        at org.apache.nifi.util.hive.HiveWriter.<init>(HiveWriter.java:79)
        at org.apache.nifi.util.hive.HiveUtils.makeHiveWriter(HiveUtils.java:46)
        at 
org.apache.nifi.processors.hive.PutHiveStreaming.makeHiveWriter(PutHiveStreaming.java:968)
        at 
org.apache.nifi.processors.hive.PutHiveStreaming.getOrCreateWriter(PutHiveStreaming.java:879)
        at 
org.apache.nifi.processors.hive.PutHiveStreaming.lambda$null$8(PutHiveStreaming.java:680)
        at 
org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:127)
        ... 19 common frames omitted
Caused by: org.apache.hive.hcatalog.streaming.StreamingException: partition 
values=[356945013]. Unable to get path for end point: [356945013]
        at 
org.apache.hive.hcatalog.streaming.AbstractRecordWriter.getPathForEndPoint(AbstractRecordWriter.java:166)
        at 
org.apache.hive.hcatalog.streaming.AbstractRecordWriter.<init>(AbstractRecordWriter.java:68)
        at 
org.apache.hive.hcatalog.streaming.StrictJsonWriter.<init>(StrictJsonWriter.java:62)
        at 
org.apache.nifi.util.hive.HiveWriter.getRecordWriter(HiveWriter.java:85)
        at org.apache.nifi.util.hive.HiveWriter.<init>(HiveWriter.java:72)
        ... 24 common frames omitted
Caused by: org.apache.hadoop.hive.metastore.api.NoSuchObjectException: 
partition values=[356945013]
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_partition_result$get_partition_resultStandardScheme.read(ThriftHiveMetastore.java:59058)
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_partition_result$get_partition_resultStandardScheme.read(ThriftHiveMetastore.java:59026)
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_partition_result.read(ThriftHiveMetastore.java:58957)
        at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_partition(ThriftHiveMetastore.java:1860)
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_partition(ThriftHiveMetastore.java:1845)
        at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getPartition(HiveMetaStoreClient.java:1174)
        at sun.reflect.GeneratedMethodAccessor402.invoke(Unknown Source)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:152)
        at com.sun.proxy.$Proxy144.getPartition(Unknown Source)
        at 
org.apache.hive.hcatalog.streaming.AbstractRecordWriter.getPathForEndPoint(AbstractRecordWriter.java:161)
        ... 28 common frames omitted
