Re: PutHiveStreaming NullPointerException error

2018-04-21 Thread Shawn Weeks
I realize this is a bit late, but I ran into your emails when I encountered 
the same issue on my installation of NiFi. I don't know what the too many users 
message is about but I do know why NiFi gets a null pointer exception. The 
PutHiveStreaming Processor uses the HiveEndpoint Class to communicate with Hive 
and when you tell it to auto-create partitions, it creates a new instance of 
CliSessionState, an extended version of SessionState. This can be seen on line 
458 at 
https://github.com/apache/hive/blob/master/hcatalog/streaming/src/java/org/apache/hive/hcatalog/streaming/HiveEndPoint.java.
 Unfortunately, CliSessionState doesn't set the IP address. If you look at the 
line mentioned in AuthorizationMetaStoreFilterHook, you'll see it's calling 
getUserIpAddress on a SessionState instance. Since the partition is actually 
created, the error must be occurring afterwards. I'm assuming this is a bug, 
but none of this code has changed in a really long time.
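In code terms, the failure mode reduces to something like the sketch below. These are stand-in classes that only mimic the shape of Hive's SessionState, CliSessionState, and the metastore filter hook; the getUserIpAddress call is the one named in the stack trace, but the class bodies and the surrounding logic here are illustrative, not the real Hive sources:

```java
// Stand-in for org.apache.hadoop.hive.ql.session.SessionState
// (illustrative only, not the real Hive class).
class SessionState {
    private String userIpAddress; // populated for HiveServer2 sessions

    public String getUserIpAddress() { return userIpAddress; }
    public void setUserIpAddress(String ip) { this.userIpAddress = ip; }
}

// Stand-in for CliSessionState, which HiveEndPoint creates when asked to
// auto-create partitions; note it never sets the IP address.
class CliSessionState extends SessionState { }

public class NpeSketch {
    // Stand-in for the filter-hook logic that reads the caller's IP from
    // the current session state and uses it without a null check.
    static String describeCaller(SessionState ss) {
        // When ss is a CliSessionState, getUserIpAddress() returns null and
        // this dereference throws the NullPointerException seen in the logs.
        return "caller ip: " + ss.getUserIpAddress().trim();
    }

    public static void main(String[] args) {
        SessionState hs2Session = new SessionState();
        hs2Session.setUserIpAddress("10.0.0.1");
        System.out.println(describeCaller(hs2Session)); // works fine

        try {
            describeCaller(new CliSessionState()); // streaming path
        } catch (NullPointerException e) {
            System.out.println("NPE, matching the quoted stack trace");
        }
    }
}
```

If this diagnosis is right, the fix would be either for HiveEndPoint to populate the IP address on the CliSessionState it creates, or for the filter hook to tolerate a null address.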
I'm going to try building the Hive Streaming example outside of NiFi and see if 
it works at all.
Thanks
Shawn

On 2018/02/15 17:44:10, Michal Tomaszewski  wrote:
> Hi All,>
>
> PutHiveStreaming throws lots of errors like below:>
>
> ERROR [Timer-Driven Process Thread-11] hive.log Got exception: 
> java.lang.NullPointerException null>
> java.lang.NullPointerException: null>
>
> Errors appear while PutHiveStreaming is running, anywhere from every few 
> seconds to a few times a second.>
>
> Hive Streaming writes data to the database but is extremely slow (about 2 MB 
> per 5 minutes on a 6-core/24 GB NiFi node).>
> Tested on NiFi 1.4, 1.5 and current 1.6 snapshot.>
>
> Hadoop/Hive cluster uses Hortonworks HDP 2.6.4 installation in HA mode 
> without any security.>
> NiFi works in a 3-server cluster without security. NiFi was compiled with 
> -Phortonworks and the proper library definitions.>

> The NiFi conf directory contains the actual Hadoop configuration files: 
> core-site.xml, hbase-site.xml, hdfs-site.xml, hive-site.xml, yarn-site.xml>

> HiveQL queries in NiFi (both SelectHiveQL and PutHiveQL) work properly and 
> fast.>
>
> The same error occurs when using Hortonworks' original NiFi 1.5 build 
> installed automatically by Ambari using the HDF 3.1 pack.>

>
> nifi-app.log:>
>
> 2018-02-15 17:42:29,889 INFO [put-hive-streaming-0] 
> org.apache.hadoop.hive.ql.hooks.ATSHook Created ATS Hook>
> 2018-02-15 17:42:29,889 INFO [put-hive-streaming-0] 
> org.apache.hadoop.hive.ql.log.PerfLogger  method=PostHook.org.apache.hadoop.hive.ql.hooks.ATSHook 
> from=org.apache.hadoop.hive.ql.Driver>>

> 2018-02-15 17:42:29,890 INFO [put-hive-streaming-0] 
> org.apache.hadoop.hive.ql.log.PerfLogger  method=PostHook.org.apache.hadoop.hive.ql.hooks.ATSHook start=1518712949889 
> end=1518712949890 duration=1 from=org.apache.hadoop.hive.ql.Driver>>

> 2018-02-15 17:42:29,890 INFO [put-hive-streaming-0] 
> org.apache.hadoop.hive.ql.Driver Resetting the caller context to>

> 2018-02-15 17:42:29,890 INFO [put-hive-streaming-0] 
> org.apache.hadoop.hive.ql.log.PerfLogger  start=1518712949845 end=1518712949890 duration=45 
> from=org.apache.hadoop.hive.ql.Driver>>

> 2018-02-15 17:42:29,890 INFO [put-hive-streaming-0] 
> org.apache.hadoop.hive.ql.Driver OK>
> 2018-02-15 17:42:29,890 INFO [put-hive-streaming-0] 
> org.apache.hadoop.hive.ql.log.PerfLogger  from=org.apache.hadoop.hive.ql.Driver>>

> 2018-02-15 17:42:29,890 INFO [put-hive-streaming-0] 
> o.a.hadoop.hive.ql.lockmgr.DbTxnManager Stopped heartbeat for query: 
> nifi_20180215174229_58046dbe-0af7-41ae-94c4-7bb692053d67>

> 2018-02-15 17:42:29,890 INFO [put-hive-streaming-0] 
> o.a.hadoop.hive.ql.lockmgr.DbLockManager releaseLocks: [lockid:5010336 
> queryId=nifi_20180215174229_58046dbe-0af7-41ae-94c4-7bb692053d67 txnid:0]>

> 2018-02-15 17:42:29,894 INFO [put-hive-streaming-0] 
> org.apache.hadoop.hive.ql.log.PerfLogger  start=1518712949890 end=1518712949894 duration=4 
> from=org.apache.hadoop.hive.ql.Driver>>

> 2018-02-15 17:42:29,894 INFO [put-hive-streaming-0] 
> org.apache.hadoop.hive.ql.log.PerfLogger  start=1518712949819 end=1518712949894 duration=75 
> from=org.apache.hadoop.hive.ql.Driver>>

> 2018-02-15 17:42:29,898 WARN [Timer-Driven Process Thread-11] hive.metastore 
> Unexpected increment of user count beyond one: 2 HCatClient: thread: 132 
> users=2 expired=false closed=false>

> 2018-02-15 17:42:29,901 ERROR [Timer-Driven Process Thread-11] hive.log Got 
> exception: java.lang.NullPointerException null>

> java.lang.NullPointerException: null>
> at 
> org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook.getFilteredObjects(AuthorizationMetaStoreFilterHook.java:77)>

> at 
> org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook.filterDatabases(AuthorizationMetaStoreFilterHook.java:54)>

> at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabases(HiveMetaStoreClient.java:1116)>
> at 
> org.apache.hive.hcatalog

RE: PutHiveStreaming NullPointerException error

2018-02-20 Thread Michal Tomaszewski
Hi Matt,

Following up on the information I sent, I also have some additional questions:

- Are there any other tests I can perform to verify the source of the problem?

- Do you have any idea what the error
ERROR [NiFi logging handler] org.apache.nifi.StdErr OK
in the NiFi bootstrap log means, what causes it, and how to get rid of it?

- Do you know the source of this warning:
WARN [Timer-Driven Process Thread-11] hive.metastore Unexpected increment of 
user count beyond one: 2 HCatClient: thread: 132 users=2 expired=false 
closed=false
even when there is only one PutHiveStreaming running, as a single task on the 
primary node only, with the max open connections property set to 1?
It seems like PutHiveStreaming is not closing previous connections properly...

- Is there any way to set the username/password that PutHiveStreaming uses to 
connect to Hive?

Regards,
Mike

> -Original Message-
> From: Michal Tomaszewski
> Sent: Monday, February 19, 2018 4:10 PM
> To: users@nifi.apache.org
> Subject: RE: PutHiveStreaming NullPointerException error
>
> Matt,
>
> Regarding all mentioned points:
>
> > The post you mention has a comment in there suggesting perhaps the OS
> > user running NiFi/Java might somehow be restricted from obtaining its
> > own IP address, could that be an issue?  Alternatively, if it cannot
> > get a SessionState for some reason, I would presume there would be
> > error(s) in the hive metastore log?
>
> a/ I don't think that's possible. NiFi runs under the "nifi" user (hadoop 
> group), without any limits. NiFi was installed according to the community 
> documentation for Ubuntu Server.
> To be 100% sure, we tested starting NiFi under the "hive:hadoop" account - no 
> change. We also tested changing the Hive server properties to run all queries 
> as the "hive" user (hive.server2.enable.doAs) - no change.
>
> b/ in logs:
>
> hivemetastore.log shows only warnings (and they seem to be connected to NiFi 
> streaming):
> 2018-02-19 00:45:06,492 WARN  [pool-5-thread-224]: conf.HiveConf
> (HiveConf.java:initialize(3093)) - HiveConf of name hive.log.dir does not 
> exist
> 2018-02-19 00:45:06,492 WARN  [pool-5-thread-224]: conf.HiveConf
> (HiveConf.java:initialize(3093)) - HiveConf of name hive.log.file does not 
> exist
> 2018-02-19 00:45:06,563 WARN  [pool-5-thread-224]: conf.HiveConf
> (HiveConf.java:initialize(3093)) - HiveConf of name hive.log.dir does not 
> exist
> 2018-02-19 00:45:06,563 WARN  [pool-5-thread-224]: conf.HiveConf
> (HiveConf.java:initialize(3093)) - HiveConf of name hive.log.file does not 
> exist
> 2018-02-19 00:45:08,399 WARN  [pool-5-thread-163]: conf.HiveConf
> (HiveConf.java:initialize(3093)) - HiveConf of name hive.log.dir does not 
> exist
> 2018-02-19 00:45:08,399 WARN  [pool-5-thread-163]: conf.HiveConf
> (HiveConf.java:initialize(3093)) - HiveConf of name hive.log.file does not 
> exist
> 2018-02-19 00:45:08,468 WARN  [pool-5-thread-163]: conf.HiveConf
> (HiveConf.java:initialize(3093)) - HiveConf of name hive.log.dir does not 
> exist
> 2018-02-19 00:45:08,468 WARN  [pool-5-thread-163]: conf.HiveConf
> (HiveConf.java:initialize(3093)) - HiveConf of name hive.log.file does not 
> exist
> 2018-02-19 00:45:08,919 WARN  [pool-5-thread-324]: conf.HiveConf
> (HiveConf.java:initialize(3093)) - HiveConf of name hive.log.dir does not 
> exist
> 2018-02-19 00:45:08,919 WARN  [pool-5-thread-324]: conf.HiveConf
> (HiveConf.java:initialize(3093)) - HiveConf of name hive.log.file does not 
> exist
> 2018-02-19 00:45:08,989 WARN  [pool-5-thread-324]: conf.HiveConf
> (HiveConf.java:initialize(3093)) - HiveConf of name hive.log.dir does not 
> exist
> 2018-02-19 00:45:08,990 WARN  [pool-5-thread-324]: conf.HiveConf
> (HiveConf.java:initialize(3093)) - HiveConf of name hive.log.file does not 
> exist
>
>
> hiveserver2.log contains only warnings like these:
>
> 2018-02-19 15:30:58,871 WARN  [HiveServer2-Handler-Pool: Thread-68]:
> conf.HiveConf (HiveConf.java:initialize(3093)) - HiveConf of name
> hive.internal.ss.authz.settings.applied.marker does not exist
> 2018-02-19 15:30:58,883 WARN  [HiveServer2-Handler-Pool: Thread-68]:
> metrics2.CodahaleMetrics (CodahaleMetrics.java:addGauge(299)) - A Gauge
> with name [init_total_count_dbs] already exists.  The old gauge will be
> overwritten, but this is not recommended
> 2018-02-19 15:30:58,884 WARN  [HiveServer2-Handler-Pool: Thread-68]:
> metrics2.CodahaleMetrics (CodahaleMetrics.java:addGauge(299)) - A Gauge
> with name [init_total_count_tables] already exists.  The old gauge will be
> overwritten, but this is not recommended
> 2018-02-19 15:30:58,884 WARN  [HiveServer2-Handler-Pool: Thread-68]:
> metrics2.CodahaleMetrics (CodahaleMetrics.java:addGauge(299)) - A Gauge
> with name [init_total_count_parti

RE: PutHiveStreaming NullPointerException error

2018-02-19 Thread Michal Tomaszewski
R [NiFi logging handler] org.apache.nifi.StdErr OK
2018-02-19 15:39:33,248 ERROR [NiFi logging handler] org.apache.nifi.StdErr OK
2018-02-19 15:39:33,312 ERROR [NiFi logging handler] org.apache.nifi.StdErr OK
2018-02-19 15:40:35,201 ERROR [NiFi logging handler] org.apache.nifi.StdErr OK
2018-02-19 15:40:35,260 ERROR [NiFi logging handler] org.apache.nifi.StdErr OK
2018-02-19 15:41:36,825 ERROR [NiFi logging handler] org.apache.nifi.StdErr OK
2018-02-19 15:41:36,899 ERROR [NiFi logging handler] org.apache.nifi.StdErr OK



>
> I have seen this error with the Atlas hook, and evidently it also happens with
> the metastore hook.  Are you overriding the metastore
> AuthorizationManager? If so the errors might be red herrings [1].
> [1] https://issues.apache.org/jira/browse/HIVE-11190

[1]
We are not overriding anything - it is a default installation with nearly no 
changes. The only notable fact is that our Hive/Hadoop cluster runs in HA mode 
and is unsecured, which (I think) is not very common...

> Are you
> using multiple concurrent tasks for the PutHiveStreaming processor? It's
> possible you might be running into [2].
> [2] https://issues.apache.org/jira/browse/HIVE-12409

[2]
In fact, we are using Hive Streaming on each node (3-node NiFi cluster), so 
they are concurrent in a sense.
I also thought that could be the cause, so I tested it more thoroughly:
There are 7 PutHiveStreaming components in the flow running on the 3-node 
cluster. All are streaming to different databases. All are set to 1 task per 
node.
To verify the concurrency problem I ran a test:
I reconfigured the flow to run ONE PutHiveStreaming component on the NiFi 
Primary Node only (the rest of the PutHiveStreaming components were stopped 
during the test), setting 1 task and 1 max open connections (in the Hive 
Streaming properties).
With this configuration the error appeared in the logs as usual, so I believe 
it is not an issue caused by [2].


Regards,
Mike
>
> On Mon, Feb 19, 2018 at 8:35 AM, Michal Tomaszewski
>  wrote:
> > Hi Matt,
> > Thank you for reply.
> > NiFi is compiled exactly like you mentioned. Otherwise we could not use
> PutHiveQL processors.
> >
> > We compiled the NiFi 1.6 snapshot using exactly:
> > mvn -T 2.0C clean install -Phortonworks
> > -Dhive.version=1.2.1000.2.6.4.0-91
> > -Dhive.hadoop.version=2.7.3.2.6.4.0-91
> > -Dhadoop.version=2.7.3.2.6.4.0-91 -DskipTests -e
> >
> >
> > Regards,
> > Mike
> >
> >> -Original Message-
> >> From: Matt Burgess [mailto:mattyb...@apache.org]
> >> Sent: Monday, February 19, 2018 2:30 PM
> >> To: users@nifi.apache.org
> >> Subject: Re: PutHiveStreaming NullPointerException error
> >>
> >> Mike,
> >>
> >> Joe is correct, in order for Apache NiFi to interact with HDP Hive,
> >> the Hive client dependencies need to be swapped out, as HDP Hive 1.x
> >> components are not 100% compatible with Apache Hive 1.x components.
> >> This can be done (in general) while building NiFi with Maven, by
> >> using a vendor profile and overriding the "hive.version" and
> "hive.hadoop.version"
> >> properties. There are currently 3 vendor profiles (hortonworks,
> >> cloudera, mapr), and there are examples of how to override the
> >> properties in the top- level pom.xml (in the vendor profile section).
> >> An example to build with HDP Hive components is:
> >>
> >> mvn clean install -Phortonworks -Dhive.version=1.2.1000.2.6.0.3-8
> >> -Dhive.hadoop.version=2.7.3.2.6.0.3-8
> >>
> >> You should be able to use this in the
> >> nifi-nar-bundles/nifi-hive-bundle directory to only build the Hive
> >> NAR(s), rather than a full rebuild. Then you can replace the Apache
> >> NiFi versions of the Hive NARs in your distribution with these vendor-
> specific ones.
> >>
> >> Regards,
> >> Matt
> >>
> >> On Mon, Feb 19, 2018 at 8:21 AM, Joe Witt  wrote:
> >> > Mike - ah i see . Thanks for clarifying.
> >> >
> >> > I think the issue is that to interact properly with the Hive
> >> > version in HDP we have to swap out some dependencies at build time.
> >> >
> >> > Mattyb is definitely more knowledgeable so hopefully he can comment
> >> soon.
> >> >
> >> > On Mon, Feb 19, 2018 at 8:06 AM, Michal Tomaszewski
> >> >  wrote:
> >> >> Hi Joe,
> >> >> Thanks for prompt answer.
> >> >> As I wrote - I'm currently using NiFi 1.6 snapshot compiled from
> >> >> latest
> >> community sources https://github.com/apache/nifi.
> >> >> I don&#

Re: PutHiveStreaming NullPointerException error

2018-02-19 Thread Matt Burgess
The post you mention has a comment in there suggesting perhaps the OS
user running NiFi/Java might somehow be restricted from obtaining its
own IP address, could that be an issue?  Alternatively, if it cannot
get a SessionState for some reason, I would presume there would be
error(s) in the hive metastore log?

I have seen this error with the Atlas hook, and evidently it also
happens with the metastore hook.  Are you overriding the metastore
AuthorizationManager? If so the errors might be red herrings [1].  Are
you using multiple concurrent tasks for the PutHiveStreaming
processor? It's possible you might be running into [2].

Regards,
Matt

[1] https://issues.apache.org/jira/browse/HIVE-11190
[2] https://issues.apache.org/jira/browse/HIVE-12409


On Mon, Feb 19, 2018 at 8:35 AM, Michal Tomaszewski
 wrote:
> Hi Matt,
> Thank you for reply.
> NiFi is compiled exactly like you mentioned. Otherwise we could not use 
> PutHiveQL processors.
>
> We compiled the NiFi 1.6 snapshot using exactly:
> mvn -T 2.0C clean install -Phortonworks -Dhive.version=1.2.1000.2.6.4.0-91 
> -Dhive.hadoop.version=2.7.3.2.6.4.0-91 -Dhadoop.version=2.7.3.2.6.4.0-91 
> -DskipTests -e
>
>
> Regards,
> Mike
>
>> -Original Message-
>> From: Matt Burgess [mailto:mattyb...@apache.org]
>> Sent: Monday, February 19, 2018 2:30 PM
>> To: users@nifi.apache.org
>> Subject: Re: PutHiveStreaming NullPointerException error
>>
>> Mike,
>>
>> Joe is correct, in order for Apache NiFi to interact with HDP Hive, the Hive
>> client dependencies need to be swapped out, as HDP Hive 1.x components
>> are not 100% compatible with Apache Hive 1.x components.
>> This can be done (in general) while building NiFi with Maven, by using a
>> vendor profile and overriding the "hive.version" and "hive.hadoop.version"
>> properties. There are currently 3 vendor profiles (hortonworks, cloudera,
>> mapr), and there are examples of how to override the properties in the top-
>> level pom.xml (in the vendor profile section). An example to build with HDP
>> Hive components is:
>>
>> mvn clean install -Phortonworks -Dhive.version=1.2.1000.2.6.0.3-8
>> -Dhive.hadoop.version=2.7.3.2.6.0.3-8
>>
>> You should be able to use this in the
>> nifi-nar-bundles/nifi-hive-bundle directory to only build the Hive NAR(s),
>> rather than a full rebuild. Then you can replace the Apache NiFi versions of
>> the Hive NARs in your distribution with these vendor-specific ones.
>>
>> Regards,
>> Matt
>>
>> On Mon, Feb 19, 2018 at 8:21 AM, Joe Witt  wrote:
>> > Mike - ah i see . Thanks for clarifying.
>> >
>> > I think the issue is that to interact properly with the Hive version
>> > in HDP we have to swap out some dependencies at build time.
>> >
>> > Mattyb is definitely more knowledgeable so hopefully he can comment
>> soon.
>> >
>> > On Mon, Feb 19, 2018 at 8:06 AM, Michal Tomaszewski
>> >  wrote:
>> >> Hi Joe,
>> >> Thanks for prompt answer.
>> >> As I wrote - I'm currently using NiFi 1.6 snapshot compiled from latest
>> community sources https://github.com/apache/nifi.
>> >> I don't use Hortonworks' NiFi version. I verified Hortonworks' NiFi 
>> >> version
>> only in order to be sure the problem is not connected with my compilation,
>> as NiFi sends data to hive cluster on hortonworks.
>> >>
>> >> The problem is confirmed on community version of NiFi 1.6 (current
>> snapshot), 1.5, 1.5 pre-RC and 1.4.
>> >>
>> >> Regards,
>> >> Mike
>> >>
>> >>
>> >>
>> >>
>> >>> -Original Message-
>> >>> From: Joe Witt [mailto:joe.w...@gmail.com]
>> >>> Sent: Monday, February 19, 2018 1:53 PM
>> >>> To: users@nifi.apache.org
>> >>> Subject: Re: PutHiveStreaming NullPointerException error
>> >>>
>> >>> Mike,
>> >>>
> >>> Dev is fine but I think part of the difficulty for this group here
> >>> is you're referring to a vendor distribution that bundles apache nifi.
>> >>> The libraries involved are different than what we build/provide
>> >>> directly.  If you can recreate the same problem using apache nifi as
>> >>> we provide it as a community it would be easier for someone to help
>> >>> here and otherwise you might want to reach out to the vendor for help
>> with that configuration.
>> >>>
>> &

RE: PutHiveStreaming NullPointerException error

2018-02-19 Thread Michal Tomaszewski
Hi Matt,
Thank you for the reply.
NiFi is compiled exactly as you mentioned. Otherwise we could not use the 
PutHiveQL processors.

We compiled the NiFi 1.6 snapshot using exactly:
mvn -T 2.0C clean install -Phortonworks -Dhive.version=1.2.1000.2.6.4.0-91 
-Dhive.hadoop.version=2.7.3.2.6.4.0-91 -Dhadoop.version=2.7.3.2.6.4.0-91 
-DskipTests -e


Regards,
Mike

> -Original Message-
> From: Matt Burgess [mailto:mattyb...@apache.org]
> Sent: Monday, February 19, 2018 2:30 PM
> To: users@nifi.apache.org
> Subject: Re: PutHiveStreaming NullPointerException error
>
> Mike,
>
> Joe is correct, in order for Apache NiFi to interact with HDP Hive, the Hive
> client dependencies need to be swapped out, as HDP Hive 1.x components
> are not 100% compatible with Apache Hive 1.x components.
> This can be done (in general) while building NiFi with Maven, by using a
> vendor profile and overriding the "hive.version" and "hive.hadoop.version"
> properties. There are currently 3 vendor profiles (hortonworks, cloudera,
> mapr), and there are examples of how to override the properties in the top-
> level pom.xml (in the vendor profile section). An example to build with HDP
> Hive components is:
>
> mvn clean install -Phortonworks -Dhive.version=1.2.1000.2.6.0.3-8
> -Dhive.hadoop.version=2.7.3.2.6.0.3-8
>
> You should be able to use this in the
> nifi-nar-bundles/nifi-hive-bundle directory to only build the Hive NAR(s),
> rather than a full rebuild. Then you can replace the Apache NiFi versions of
> the Hive NARs in your distribution with these vendor-specific ones.
>
> Regards,
> Matt
>
> On Mon, Feb 19, 2018 at 8:21 AM, Joe Witt  wrote:
> > Mike - ah i see . Thanks for clarifying.
> >
> > I think the issue is that to interact properly with the Hive version
> > in HDP we have to swap out some dependencies at build time.
> >
> > Mattyb is definitely more knowledgeable so hopefully he can comment
> soon.
> >
> > On Mon, Feb 19, 2018 at 8:06 AM, Michal Tomaszewski
> >  wrote:
> >> Hi Joe,
> >> Thanks for prompt answer.
> >> As I wrote - I'm currently using NiFi 1.6 snapshot compiled from latest
> community sources https://github.com/apache/nifi.
> >> I don't use Hortonworks' NiFi version. I verified Hortonworks' NiFi version
> only in order to be sure the problem is not connected with my compilation,
> as NiFi sends data to hive cluster on hortonworks.
> >>
> >> The problem is confirmed on community version of NiFi 1.6 (current
> snapshot), 1.5, 1.5 pre-RC and 1.4.
> >>
> >> Regards,
> >> Mike
> >>
> >>
> >>
> >>
> >>> -Original Message-
> >>> From: Joe Witt [mailto:joe.w...@gmail.com]
> >>> Sent: Monday, February 19, 2018 1:53 PM
> >>> To: users@nifi.apache.org
> >>> Subject: Re: PutHiveStreaming NullPointerException error
> >>>
> >>> Mike,
> >>>
> >>> Dev is fine but I think part of the difficulty for this group here
> >>> is you're referring to a vendor distribution that bundles apache nifi.
> >>> The libraries involved are different than what we build/provide
> >>> directly.  If you can recreate the same problem using apache nifi as
> >>> we provide it as a community it would be easier for someone to help
> >>> here and otherwise you might want to reach out to the vendor for help
> with that configuration.
> >>>
> >>> Thanks
> >>>
> >>> On Mon, Feb 19, 2018 at 6:13 AM, Michal Tomaszewski
> >>>  wrote:
> >>> > Hi Team,
> >>> >
> >>> > Should I send this question to NiFi dev list instead of NiFi users?
> >>> >
> >>> > Regards,
> >>> > Mike
> >>> >
> >>> >>> 2018-02-15 17:42:29,901 ERROR [Timer-Driven Process Thread-11]
> >>> hive.log Got exception: java.lang.NullPointerException null
> >>> java.lang.NullPointerException: null
> >>> >>>at
> >>> org.apache.hadoop.hive.ql.security.authorization.plugin.Authorizatio
> >>> nMetaS
> >>> toreFilterHook.getFilteredObjects(AuthorizationMetaStoreFilterHook.j
> >>> ava:7
> >>> 7)
> >>> >>>at
> >>> org.apache.hadoop.hive.ql.security.authorization.plugin.Authorizatio
> >>> nMetaS
> >>> toreFilterHook.filterDatabases(AuthorizationMetaStoreFilterHook.java
> >>> :54)
> >>> >>>at
> >>>
> org.ap

RE: PutHiveStreaming NullPointerException error

2018-02-19 Thread Michal Tomaszewski
Thanks Joe!
I hope there will be a simple solution for this...
Please note the same error also exists on the Hortonworks version of NiFi...
I can only suspect the problem may be connected to the authentication process 
when NiFi and the cluster are unsecured. Unlike the HiveQL processors, Hive 
Streaming has no place to enter a username and password. BTW: Don't you think 
such an option would be useful?

> -Original Message-
> From: Joe Witt [mailto:joe.w...@gmail.com]
> Sent: Monday, February 19, 2018 2:21 PM
> To: users@nifi.apache.org
> Subject: Re: PutHiveStreaming NullPointerException error
>
> Mike - ah i see . Thanks for clarifying.
>
> I think the issue is that to interact properly with the Hive version in HDP we
> have to swap out some dependencies at build time.
>
> Mattyb is definitely more knowledgeable so hopefully he can comment
> soon.
>
> On Mon, Feb 19, 2018 at 8:06 AM, Michal Tomaszewski
>  wrote:
> > Hi Joe,
> > Thanks for prompt answer.
> > As I wrote - I'm currently using NiFi 1.6 snapshot compiled from latest
> community sources https://github.com/apache/nifi.
> > I don't use Hortonworks' NiFi version. I verified Hortonworks' NiFi version
> only in order to be sure the problem is not connected with my compilation,
> as NiFi sends data to hive cluster on hortonworks.
> >
> > The problem is confirmed on community version of NiFi 1.6 (current
> snapshot), 1.5, 1.5 pre-RC and 1.4.
> >
> > Regards,
> > Mike
> >
> >
> >
> >
> >> -Original Message-
> >> From: Joe Witt [mailto:joe.w...@gmail.com]
> >> Sent: Monday, February 19, 2018 1:53 PM
> >> To: users@nifi.apache.org
> >> Subject: Re: PutHiveStreaming NullPointerException error
> >>
> >> Mike,
> >>
> >> Dev is fine but I think part of the difficulty for this group here
> >> is you're referring to a vendor distribution that bundles apache nifi.
> >> The libraries involved are different than what we build/provide
> >> directly.  If you can recreate the same problem using apache nifi as
> >> we provide it as a community it would be easier for someone to help
> >> here and otherwise you might want to reach out to the vendor for help
> with that configuration.
> >>
> >> Thanks
> >>
> >> On Mon, Feb 19, 2018 at 6:13 AM, Michal Tomaszewski
> >>  wrote:
> >> > Hi Team,
> >> >
> >> > Should I send this question to NiFi dev list instead of NiFi users?
> >> >
> >> > Regards,
> >> > Mike
> >> >
> >> >>> 2018-02-15 17:42:29,901 ERROR [Timer-Driven Process Thread-11]
> >> hive.log Got exception: java.lang.NullPointerException null
> >> java.lang.NullPointerException: null
> >> >>>at
> >> org.apache.hadoop.hive.ql.security.authorization.plugin.Authorization
> >> MetaS
> >> toreFilterHook.getFilteredObjects(AuthorizationMetaStoreFilterHook.ja
> >> va:7
> >> 7)
> >> >>>at
> >> org.apache.hadoop.hive.ql.security.authorization.plugin.Authorization
> >> MetaS
> >> toreFilterHook.filterDatabases(AuthorizationMetaStoreFilterHook.java:
> >> 54)
> >> >>>at
> >>
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabases(Hiv
> >> e
> >> MetaStoreClient.java:1116)
> >> >>>at
> >>
> org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStor
> >> eClient.isOpen(HiveClientCache.java:469)
> >> >>>at sun.reflect.GeneratedMethodAccessor111.invoke(Unknown
> >> Source)
> >> >>>at
> >>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAcces
> >> sorImpl.java:43)
> >> >>>at java.lang.reflect.Method.invoke(Method.java:498)
> >> >>>at
> >>
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(Retry
> >> in
> >> gMetaStoreClient.java:174)
> >> >>>at com.sun.proxy.$Proxy341.isOpen(Unknown Source)
> >> >>>at
> >> org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.j
> >> av
> >> a:269)
> >> >>>at
> >>
> org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatU
> >> ti
> >> l.java:558)
> >> >>>at
> >> org.apache.hive.hcatalog.streaming.AbstractRecordWriter.(Abstra
> >> c

Re: PutHiveStreaming NullPointerException error

2018-02-19 Thread Matt Burgess
Mike,

Joe is correct, in order for Apache NiFi to interact with HDP Hive,
the Hive client dependencies need to be swapped out, as HDP Hive 1.x
components are not 100% compatible with Apache Hive 1.x components.
This can be done (in general) while building NiFi with Maven, by using
a vendor profile and overriding the "hive.version" and
"hive.hadoop.version" properties. There are currently 3 vendor
profiles (hortonworks, cloudera, mapr), and there are examples of how
to override the properties in the top-level pom.xml (in the vendor
profile section). An example to build with HDP Hive components is:

mvn clean install -Phortonworks -Dhive.version=1.2.1000.2.6.0.3-8
-Dhive.hadoop.version=2.7.3.2.6.0.3-8

You should be able to use this in the
nifi-nar-bundles/nifi-hive-bundle directory to only build the Hive
NAR(s), rather than a full rebuild. Then you can replace the Apache
NiFi versions of the Hive NARs in your distribution with these
vendor-specific ones.

Regards,
Matt

On Mon, Feb 19, 2018 at 8:21 AM, Joe Witt  wrote:
> Mike - ah i see . Thanks for clarifying.
>
> I think the issue is that to interact properly with the Hive version
> in HDP we have to swap out some dependencies at build time.
>
> Mattyb is definitely more knowledgeable so hopefully he can comment soon.
>
> On Mon, Feb 19, 2018 at 8:06 AM, Michal Tomaszewski
>  wrote:
>> Hi Joe,
>> Thanks for prompt answer.
>> As I wrote - I'm currently using NiFi 1.6 snapshot compiled from latest 
>> community sources https://github.com/apache/nifi.
>> I don't use Hortonworks' NiFi version. I verified Hortonworks' NiFi version 
>> only in order to be sure the problem is not connected with my compilation, 
>> as NiFi sends data to hive cluster on hortonworks.
>>
>> The problem is confirmed on community version of NiFi 1.6 (current 
>> snapshot), 1.5, 1.5 pre-RC and 1.4.
>>
>> Regards,
>> Mike
>>
>>
>>
>>
>>> -----Original Message-----
>>> From: Joe Witt [mailto:joe.w...@gmail.com]
>>> Sent: Monday, February 19, 2018 1:53 PM
>>> To: users@nifi.apache.org
>>> Subject: Re: PutHiveStreaming NullPointerException error
>>>
>>> Mike,
>>>
>>> Dev is fine but I think part of the difficulty for this group here is 
>>> you're
>>> referring to a vendor distribution that bundles apache nifi.
>>> The libraries involved are different than what we build/provide directly.  
>>> If
>>> you can recreate the same problem using apache nifi as we provide it as a
>>> community it would be easier for someone to help here and otherwise you
>>> might want to reach out to the vendor for help with that configuration.
>>>
>>> Thanks
>>>
>>> On Mon, Feb 19, 2018 at 6:13 AM, Michal Tomaszewski
>>>  wrote:
>>> > Hi Team,
>>> >
>>> > Should I send this question to NiFi dev list instead of NiFi users?
>>> >
>>> > Regards,
>>> > Mike
>>> >
>>> >>> 2018-02-15 17:42:29,901 ERROR [Timer-Driven Process Thread-11]
>>> hive.log Got exception: java.lang.NullPointerException null
>>> java.lang.NullPointerException: null
>>> >>>at
>>> org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaS
>>> toreFilterHook.getFilteredObjects(AuthorizationMetaStoreFilterHook.java:7
>>> 7)
>>> >>>at
>>> org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaS
>>> toreFilterHook.filterDatabases(AuthorizationMetaStoreFilterHook.java:54)
>>> >>>at
>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabases(Hive
>>> MetaStoreClient.java:1116)
>>> >>>at
>>> org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStor
>>> eClient.isOpen(HiveClientCache.java:469)
>>> >>>at sun.reflect.GeneratedMethodAccessor111.invoke(Unknown
>>> Source)
>>> >>>at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAcces
>>> sorImpl.java:43)
>>> >>>at java.lang.reflect.Method.invoke(Method.java:498)
>>> >>>at
>>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(Retryin
>>> gMetaStoreClient.java:174)
>>> >>>at com.sun.proxy.$Proxy341.isOpen(Unknown Source)
>>> >>>at
>>> org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.jav
>>> a:269)
>>> >&

Re: PutHiveStreaming NullPointerException error

2018-02-19 Thread Joe Witt
Mike - ah i see . Thanks for clarifying.

I think the issue is that to interact properly with the Hive version
in HDP we have to swap out some dependencies at build time.

Mattyb is definitely more knowledgeable so hopefully he can comment soon.

On Mon, Feb 19, 2018 at 8:06 AM, Michal Tomaszewski
 wrote:
> Hi Joe,
> Thanks for the prompt answer.
> As I wrote, I'm currently using a NiFi 1.6 snapshot compiled from the latest 
> community sources: https://github.com/apache/nifi.
> I don't use Hortonworks' NiFi version. I verified Hortonworks' NiFi version 
> only to be sure the problem is not connected with my compilation, as NiFi 
> sends data to a Hive cluster on Hortonworks.
>
> The problem is confirmed on the community versions of NiFi 1.6 (current 
> snapshot), 1.5, 1.5 pre-RC, and 1.4.
>
> Regards,
> Mike
>
>
>
>
>> -Original Message-
>> From: Joe Witt [mailto:joe.w...@gmail.com]
>> Sent: Monday, February 19, 2018 1:53 PM
>> To: users@nifi.apache.org
>> Subject: Re: PutHiveStreaming NullPointerException error
>>
>> Mike,
>>
>> Dev is fine but I think part of the difficulty for this group here is you're
>> referring to a vendor distribution that bundles apache nifi.
>> The libraries involved are different than what we build/provide directly.  If
>> you can recreate the same problem using apache nifi as we provide it as a
>> community it would be easier for someone to help here and otherwise you
>> might want to reach out to the vendor for help with that configuration.
>>
>> Thanks
>>
>> On Mon, Feb 19, 2018 at 6:13 AM, Michal Tomaszewski
>>  wrote:
>> > Hi Team,
>> >
>> > Should I send this question to NiFi dev list instead of NiFi users?
>> >
>> > Regards,
>> > Mike
>> >
>> >>> 2018-02-15 17:42:29,901 ERROR [Timer-Driven Process Thread-11] hive.log Got exception: java.lang.NullPointerException null
>> >>> java.lang.NullPointerException: null
>> >>> [stack trace snipped; identical to the full trace quoted at the end of the thread]

RE: PutHiveStreaming NullPointerException error

2018-02-19 Thread Michal Tomaszewski
Hi Joe,
Thanks for prompt answer.
As I wrote - I'm currently using a NiFi 1.6 snapshot compiled from the latest 
community sources at https://github.com/apache/nifi.
I don't use Hortonworks' NiFi version; I verified it only to be sure the 
problem is not connected with my build, as 
NiFi sends data to a Hive cluster running on Hortonworks.

The problem is confirmed on community version of NiFi 1.6 (current snapshot), 
1.5, 1.5 pre-RC and 1.4.

Regards,
Mike




> -Original Message-
> From: Joe Witt [mailto:joe.w...@gmail.com]
> Sent: Monday, February 19, 2018 1:53 PM
> To: users@nifi.apache.org
> Subject: Re: PutHiveStreaming NullPointerException error
>
> Mike,
>
> Dev is fine, but I think part of the difficulty for this group here is that you're
> referring to a vendor distribution that bundles Apache NiFi.
> The libraries involved are different from what we build/provide directly.  If
> you can recreate the same problem using Apache NiFi as we provide it as a
> community, it would be easier for someone to help here; otherwise you
> might want to reach out to the vendor for help with that configuration.
>
> Thanks
>
> On Mon, Feb 19, 2018 at 6:13 AM, Michal Tomaszewski
>  wrote:
> > Hi Team,
> >
> > Should I send this question to NiFi dev list instead of NiFi users?
> >
> > Regards,
> > Mike
> >
> >>> 2018-02-15 17:42:29,901 ERROR [Timer-Driven Process Thread-11] hive.log Got exception: java.lang.NullPointerException null
> >>> java.lang.NullPointerException: null
> >>> [stack trace snipped; identical to the full trace quoted at the end of the thread]

Re: PutHiveStreaming NullPointerException error

2018-02-19 Thread Joe Witt
Mike,

Dev is fine, but I think part of the difficulty for this group here is that
you're referring to a vendor distribution that bundles Apache NiFi.
The libraries involved are different from what we build/provide
directly.  If you can recreate the same problem using Apache NiFi as
we provide it as a community, it would be easier for someone to help
here; otherwise you might want to reach out to the vendor for help
with that configuration.

Thanks

On Mon, Feb 19, 2018 at 6:13 AM, Michal Tomaszewski
 wrote:
> Hi Team,
>
> Should I send this question to NiFi dev list instead of NiFi users?
>
> Regards,
> Mike
>
>>> 2018-02-15 17:42:29,901 ERROR [Timer-Driven Process Thread-11] hive.log Got exception: java.lang.NullPointerException null
>>> java.lang.NullPointerException: null
>>> [stack trace snipped; identical to the full trace quoted at the end of the thread]

RE: PutHiveStreaming NullPointerException error

2018-02-19 Thread Michal Tomaszewski
Hi Team,

Should I send this question to NiFi dev list instead of NiFi users?

Regards,
Mike

>> 2018-02-15 17:42:29,901 ERROR [Timer-Driven Process Thread-11] hive.log Got exception: java.lang.NullPointerException null
>> java.lang.NullPointerException: null
>>at 
>> org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook.getFilteredObjects(AuthorizationMetaStoreFilterHook.java:77)
>>at 
>> org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook.filterDatabases(AuthorizationMetaStoreFilterHook.java:54)
>>at 
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabases(HiveMetaStoreClient.java:1116)
>>at 
>> org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.isOpen(HiveClientCache.java:469)
>>at sun.reflect.GeneratedMethodAccessor111.invoke(Unknown Source)
>>at 
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>at java.lang.reflect.Method.invoke(Method.java:498)
>>at 
>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:174)
>>at com.sun.proxy.$Proxy341.isOpen(Unknown Source)
>>at 
>> org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:269)
>>at 
>> org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558)
>>at 
>> org.apache.hive.hcatalog.streaming.AbstractRecordWriter.<init>(AbstractRecordWriter.java:94)
>>at 
>> org.apache.hive.hcatalog.streaming.StrictJsonWriter.<init>(StrictJsonWriter.java:82)
>>at 
>> org.apache.hive.hcatalog.streaming.StrictJsonWriter.<init>(StrictJsonWriter.java:60)
>>at 
>> org.apache.nifi.util.hive.HiveWriter.getRecordWriter(HiveWriter.java:85)
>>at org.apache.nifi.util.hive.HiveWriter.<init>(HiveWriter.java:72)
>>at 
>> org.apache.nifi.util.hive.HiveUtils.makeHiveWriter(HiveUtils.java:46)
>>at 
>> org.apache.nifi.processors.hive.PutHiveStreaming.makeHiveWriter(PutHiveStreaming.java:1036)
>>at 
>> org.apache.nifi.processors.hive.PutHiveStreaming.getOrCreateWriter(PutHiveStreaming.java:947)
>>at 
>> org.apache.nifi.processors.hive.PutHiveStreaming.lambda$null$8(PutHiveStreaming.java:743)
>>at 
>> org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:127)
>>at 
>> org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onTrigger$12(PutHiveStreaming.java:740)
>>at 
>> org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2175)
>>at 
>> org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2145)
>>at 
>> org.apache.nifi.processors.hive.PutHiveStreaming.onTrigger(PutHiveStreaming.java:694)
>>at 
>> org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onTrigger$4(PutHiveStreaming.java:572)
>>at 
>> org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114)
>>at 
>> org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184)
>>at 
>> org.apache.nifi.processors.hive.PutHiveStreaming.onTrigger(PutHiveStreaming.java:572)
>>at 
>> org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1122)
>>at 
>> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
>>at 
>> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
>>at 
>> org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
>>at 
>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>>at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
>>at 
>> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
>>at 
>> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
>>at 
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>>at 
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>>at java.lang.Thread.run(Thread.java:748)
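
Shawn's later follow-up (at the top of this archive) pins the NPE on a CliSessionState that never sets the client IP address before AuthorizationMetaStoreFilterHook calls getUserIpAddress(). A minimal, self-contained sketch of that failure mode follows; the class bodies below are invented for illustration and are not Hive's actual code:

```java
// Hypothetical sketch of the failure mode described in the thread: a filter
// hook assumes a fully initialized session, but a CLI-style session never
// records the client IP, so dereferencing it throws NullPointerException.
public class SessionNpeSketch {

    /** Stand-in for Hive's SessionState; the IP is simply never set. */
    static class SessionState {
        String userIpAddress; // stays null for a CliSessionState-style session
        String getUserIpAddress() { return userIpAddress; }
    }

    /** Stand-in for AuthorizationMetaStoreFilterHook.getFilteredObjects(). */
    static String filterDatabases(SessionState ss) {
        // NPE here when the session never recorded an IP address.
        return "filtered for " + ss.getUserIpAddress().toLowerCase();
    }

    public static void main(String[] args) {
        try {
            filterDatabases(new SessionState()); // IP was never initialized
        } catch (NullPointerException e) {
            // This dereference pattern is what floods the log above.
            System.out.println(e.getClass().getName()); // java.lang.NullPointerException
        }
    }
}
```

Because the partition is still created, the dereference evidently happens after the write path succeeds, which matches the observation that data lands in Hive despite the repeated errors.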