Hi all,
I am trying to ingest data into Hive using the PutHiveStreaming processor in Apache NiFi. It works fine for an unpartitioned table, and also when the target partition already exists. But when I try to ingest into a partition that does not yet exist, it throws the following error:

2017-10-26 18:13:15,312 ERROR [Timer-Driven Process Thread-3] o.a.n.processors.hive.PutHiveStreaming PutHiveStreaming[id=80ee3b56-68a5-13e1-3251-9cc44bebddf4] Failed to create HiveWriter for endpoint: {metaStoreUri='thrift://localhost:9083', database='test', table='dpi_pt', partitionVals=[358347590] }: org.apache.nifi.util.hive.HiveWriter$ConnectFailure: Failed connecting to EndPoint {metaStoreUri='thrift://localhost:9083', database='test', table='dpi_pt', partitionVals=[358347590] }
org.apache.nifi.util.hive.HiveWriter$ConnectFailure: Failed connecting to EndPoint {metaStoreUri='thrift://localhost:9083', database='test', table='dpi_pt', partitionVals=[358347590] }
    at org.apache.nifi.util.hive.HiveWriter.<init>(HiveWriter.java:79)
    at org.apache.nifi.util.hive.HiveUtils.makeHiveWriter(HiveUtils.java:46)
    at org.apache.nifi.processors.hive.PutHiveStreaming.makeHiveWriter(PutHiveStreaming.java:968)
    at org.apache.nifi.processors.hive.PutHiveStreaming.getOrCreateWriter(PutHiveStreaming.java:879)
    at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$null$8(PutHiveStreaming.java:680)
    at org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:127)
    at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onTrigger$12(PutHiveStreaming.java:677)
    at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2174)
    at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2144)
    at org.apache.nifi.processors.hive.PutHiveStreaming.onTrigger(PutHiveStreaming.java:631)
    at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onTrigger$4(PutHiveStreaming.java:555)
    at org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114)
    at org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184)
    at org.apache.nifi.processors.hive.PutHiveStreaming.onTrigger(PutHiveStreaming.java:555)
    at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1119)
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
    at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hive.hcatalog.streaming.StreamingException: partition values=[358347590]. Unable to get path for end point: [358347590]
    at org.apache.hive.hcatalog.streaming.AbstractRecordWriter.getPathForEndPoint(AbstractRecordWriter.java:166)
    at org.apache.hive.hcatalog.streaming.AbstractRecordWriter.<init>(AbstractRecordWriter.java:68)
    at org.apache.hive.hcatalog.streaming.StrictJsonWriter.<init>(StrictJsonWriter.java:62)
    at org.apache.nifi.util.hive.HiveWriter.getRecordWriter(HiveWriter.java:85)
    at org.apache.nifi.util.hive.HiveWriter.<init>(HiveWriter.java:72)
    ... 24 common frames omitted
Caused by: org.apache.hadoop.hive.metastore.api.NoSuchObjectException: partition values=[358347590]
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_partition_result$get_partition_resultStandardScheme.read(ThriftHiveMetastore.java:59058)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_partition_result$get_partition_resultStandardScheme.read(ThriftHiveMetastore.java:59026)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_partition_result.read(ThriftHiveMetastore.java:58957)
    at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_partition(ThriftHiveMetastore.java:1860)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_partition(ThriftHiveMetastore.java:1845)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getPartition(HiveMetaStoreClient.java:1174)
    at sun.reflect.GeneratedMethodAccessor439.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:152)
    at com.sun.proxy.$Proxy145.getPartition(Unknown Source)
    at org.apache.hive.hcatalog.streaming.AbstractRecordWriter.getPathForEndPoint(AbstractRecordWriter.java:161)
    ... 28 common frames omitted

Kindly help.

Regards,
Mohit Jain
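P.S. For reference, here is a sketch of how the target table is set up. This is simplified: the real column list differs, and the partition column name below is a placeholder; only the database/table names (test.dpi_pt) and the partition value (358347590) come from the error above. Hive Streaming requires a bucketed, transactional ORC table, which is what this DDL assumes:

```sql
-- Simplified sketch of the target table (real columns/names may differ).
-- Hive Streaming requires ORC storage, bucketing, and transactional=true.
CREATE TABLE test.dpi_pt (
  payload STRING                        -- placeholder for the actual record columns
)
PARTITIONED BY (subscriber_id BIGINT)   -- hypothetical name; value 358347590 in the error
CLUSTERED BY (payload) INTO 4 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional' = 'true');

-- Since ingestion into existing partitions works fine, adding the partition
-- up front avoids the error for that one value:
ALTER TABLE test.dpi_pt ADD PARTITION (subscriber_id = 358347590);
```

Of course, pre-creating every partition by hand is not practical, which is why I am hoping the processor can create missing partitions itself.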