The post you mention has a comment suggesting that the OS user running
NiFi/Java might somehow be restricted from obtaining its own IP
address; could that be an issue?  Alternatively, if it cannot get a
SessionState for some reason, I would expect to see error(s) in the
Hive metastore log.

I have seen this error with the Atlas hook, and evidently it also
happens with the metastore hook.  Are you overriding the metastore
AuthorizationManager? If so, the errors might be red herrings [1].  Are
you using multiple concurrent tasks for the PutHiveStreaming
processor? If so, you might be running into [2].
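
As a quick way to rule out the hostname/IP angle, you could run
something like the following as the OS user that owns the NiFi process
(a rough sketch, assuming a Linux host and a service account named
"nifi"; substitute whatever account actually runs NiFi in your
environment):

sudo -u nifi hostname -f
sudo -u nifi hostname -i
sudo -u nifi getent hosts "$(hostname -f)"

If any of those fail, or return 127.0.0.1 or an unexpected address,
that would line up with the restricted-IP theory above.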

Regards,
Matt

[1] https://issues.apache.org/jira/browse/HIVE-11190
[2] https://issues.apache.org/jira/browse/HIVE-12409


On Mon, Feb 19, 2018 at 8:35 AM, Michal Tomaszewski
<michal.tomaszew...@cca.pl> wrote:
> Hi Matt,
> Thank you for your reply.
> NiFi is compiled exactly as you mentioned; otherwise we could not use the
> PutHiveQL processors.
>
> We compiled the NiFi 1.6 snapshot using exactly:
> mvn -T C2.0 clean install -Phortonworks -Dhive.version=1.2.1000.2.6.4.0-91
> -Dhive.hadoop.version=2.7.3.2.6.4.0-91 -Dhadoop.version=2.7.3.2.6.4.0-91
> -DskipTests -e
>
>
> Regards,
> Mike
>
>> -----Original Message-----
>> From: Matt Burgess [mailto:mattyb...@apache.org]
>> Sent: Monday, February 19, 2018 2:30 PM
>> To: users@nifi.apache.org
>> Subject: Re: PutHiveStreaming NullPointerException error
>>
>> Mike,
>>
>> Joe is correct: in order for Apache NiFi to interact with HDP Hive, the Hive
>> client dependencies need to be swapped out, as HDP Hive 1.x components
>> are not 100% compatible with Apache Hive 1.x components.
>> This can be done (in general) while building NiFi with Maven, by using a
>> vendor profile and overriding the "hive.version" and "hive.hadoop.version"
>> properties. There are currently 3 vendor profiles (hortonworks, cloudera,
>> mapr), and there are examples of how to override the properties in the top-
>> level pom.xml (in the vendor profile section). An example to build with HDP
>> Hive components is:
>>
>> mvn clean install -Phortonworks -Dhive.version=1.2.1000.2.6.0.3-8
>> -Dhive.hadoop.version=2.7.3.2.6.0.3-8
>>
>> You should be able to run this from the
>> nifi-nar-bundles/nifi-hive-bundle directory to build only the Hive NAR(s),
>> rather than doing a full rebuild. Then you can replace the Apache NiFi
>> versions of the Hive NARs in your distribution with these vendor-specific
>> ones.
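>>
>> For example, from the source tree (a rough sketch; the module and NAR file
>> names below are assumptions based on the usual layout of the NiFi source,
>> so adjust them to your version and install path):
>>
>> cd nifi-nar-bundles/nifi-hive-bundle
>> mvn clean install -Phortonworks -Dhive.version=1.2.1000.2.6.0.3-8 \
>>   -Dhive.hadoop.version=2.7.3.2.6.0.3-8
>> cp nifi-hive-nar/target/nifi-hive-nar-*.nar $NIFI_HOME/lib/
>>
>> (after backing up or removing the Apache-built nifi-hive-nar-*.nar already
>> in $NIFI_HOME/lib, then restarting NiFi)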
>>
>> Regards,
>> Matt
>>
>> On Mon, Feb 19, 2018 at 8:21 AM, Joe Witt <joe.w...@gmail.com> wrote:
>> > Mike - ah, I see. Thanks for clarifying.
>> >
>> > I think the issue is that to interact properly with the Hive version
>> > in HDP we have to swap out some dependencies at build time.
>> >
>> > Mattyb is definitely more knowledgeable so hopefully he can comment
>> soon.
>> >
>> > On Mon, Feb 19, 2018 at 8:06 AM, Michal Tomaszewski
>> > <michal.tomaszew...@cca.pl> wrote:
>> >> Hi Joe,
>> >> Thanks for prompt answer.
>> >> As I wrote, I'm currently using a NiFi 1.6 snapshot compiled from the latest
>> >> community sources (https://github.com/apache/nifi).
>> >> I don't use Hortonworks' NiFi version. I verified Hortonworks' NiFi version
>> >> only in order to be sure the problem is not connected with my compilation,
>> >> as NiFi sends data to a Hive cluster on Hortonworks.
>> >>
>> >> The problem is confirmed on the community version of NiFi 1.6 (current
>> >> snapshot), 1.5, 1.5 pre-RC and 1.4.
>> >>
>> >> Regards,
>> >> Mike
>> >>
>> >>
>> >>
>> >>
>> >>> -----Original Message-----
>> >>> From: Joe Witt [mailto:joe.w...@gmail.com]
>> >>> Sent: Monday, February 19, 2018 1:53 PM
>> >>> To: users@nifi.apache.org
>> >>> Subject: Re: PutHiveStreaming NullPointerException error
>> >>>
>> >>> Mike,
>> >>>
>> >>> Dev is fine, but I think part of the difficulty for this group here
>> >>> is that you're referring to a vendor distribution that bundles Apache NiFi.
>> >>> The libraries involved are different from what we build/provide
>> >>> directly.  If you can recreate the same problem using Apache NiFi as
>> >>> we provide it as a community, it would be easier for someone to help
>> >>> here; otherwise you might want to reach out to the vendor for help
>> >>> with that configuration.
>> >>>
>> >>> Thanks
>> >>>
>> >>> On Mon, Feb 19, 2018 at 6:13 AM, Michal Tomaszewski
>> >>> <michal.tomaszew...@cca.pl> wrote:
>> >>> > Hi Team,
>> >>> >
>> >>> > Should I send this question to NiFi dev list instead of NiFi users?
>> >>> >
>> >>> > Regards,
>> >>> > Mike
>> >>> >
>> >>> >>> 2018-02-15 17:42:29,901 ERROR [Timer-Driven Process Thread-11] hive.log Got exception: java.lang.NullPointerException null
>> >>> >>> java.lang.NullPointerException: null
>> >>> >>>        at org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook.getFilteredObjects(AuthorizationMetaStoreFilterHook.java:77)
>> >>> >>>        at org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook.filterDatabases(AuthorizationMetaStoreFilterHook.java:54)
>> >>> >>>        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabases(HiveMetaStoreClient.java:1116)
>> >>> >>>        at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.isOpen(HiveClientCache.java:469)
>> >>> >>>        at sun.reflect.GeneratedMethodAccessor111.invoke(Unknown Source)
>> >>> >>>        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> >>> >>>        at java.lang.reflect.Method.invoke(Method.java:498)
>> >>> >>>        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:174)
>> >>> >>>        at com.sun.proxy.$Proxy341.isOpen(Unknown Source)
>> >>> >>>        at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:269)
>> >>> >>>        at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558)
>> >>> >>>        at org.apache.hive.hcatalog.streaming.AbstractRecordWriter.<init>(AbstractRecordWriter.java:94)
>> >>> >>>        at org.apache.hive.hcatalog.streaming.StrictJsonWriter.<init>(StrictJsonWriter.java:82)
>> >>> >>>        at org.apache.hive.hcatalog.streaming.StrictJsonWriter.<init>(StrictJsonWriter.java:60)
>> >>> >>>        at org.apache.nifi.util.hive.HiveWriter.getRecordWriter(HiveWriter.java:85)
>> >>> >>>        at org.apache.nifi.util.hive.HiveWriter.<init>(HiveWriter.java:72)
>> >>> >>>        at org.apache.nifi.util.hive.HiveUtils.makeHiveWriter(HiveUtils.java:46)
>> >>> >>>        at org.apache.nifi.processors.hive.PutHiveStreaming.makeHiveWriter(PutHiveStreaming.java:1036)
>> >>> >>>        at org.apache.nifi.processors.hive.PutHiveStreaming.getOrCreateWriter(PutHiveStreaming.java:947)
>> >>> >>>        at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$null$8(PutHiveStreaming.java:743)
>> >>> >>>        at org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:127)
>> >>> >>>        at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onTrigger$12(PutHiveStreaming.java:740)
>> >>> >>>        at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2175)
>> >>> >>>        at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2145)
>> >>> >>>        at org.apache.nifi.processors.hive.PutHiveStreaming.onTrigger(PutHiveStreaming.java:694)
>> >>> >>>        at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onTrigger$4(PutHiveStreaming.java:572)
>> >>> >>>        at org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114)
>> >>> >>>        at org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184)
>> >>> >>>        at org.apache.nifi.processors.hive.PutHiveStreaming.onTrigger(PutHiveStreaming.java:572)
>> >>> >>>        at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1122)
>> >>> >>>        at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
>> >>> >>>        at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
>> >>> >>>        at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
>> >>> >>>        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>> >>> >>>        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
>> >>> >>>        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
>> >>> >>>        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
>> >>> >>>        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>> >>> >>>        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>> >>> >>>        at java.lang.Thread.run(Thread.java:748)
