From doing some Googling it seems like the problem is similar to the one
described here where Hive could no longer talk to HBase after installing
Phoenix:
https://community.hortonworks.com/questions/1652/how-can-i-query-hbase-from-hive.html

The solution in that scenario was to add a Phoenix jar to Hive's classpath,
which makes me think we would somehow have to make a Phoenix jar available
on NiFi's classpath for the HBase Client Service.

I don't know enough about Phoenix to say for sure, but I created this JIRA
to capture the issue:
https://issues.apache.org/jira/browse/NIFI-1712
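
In the meantime, a bit more context from the stack trace below: the class it
can't find, org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory,
ships with the Phoenix client, and Phoenix-enabled clusters typically set
hbase.rpc.controllerfactory.class to it in hbase-site.xml, so an HBase client
that reads that file without the Phoenix jar will hit exactly this
ClassNotFoundException. One possible interim workaround (untested, and
assuming you don't need Phoenix's custom RPC priorities from NiFi) would be to
point the HBase Client Service at a copy of hbase-site.xml that sets the
property back to the stock HBase factory:

    <property>
      <!-- assumption: client-side override only; the cluster's hbase-site.xml is untouched -->
      <name>hbase.rpc.controllerfactory.class</name>
      <value>org.apache.hadoop.hbase.ipc.RpcControllerFactory</value>
    </property>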


On Thu, Mar 31, 2016 at 2:28 PM, Guillaume Pool <[email protected]> wrote:

> Hi,
>
>
>
> Yes, here it is
>
>
>
>   <configuration>
>     <property>
>       <name>fs.defaultFS</name>
>       <value>hdfs://supergrpcluster</value>
>     </property>
>     <property>
>       <name>fs.trash.interval</name>
>       <value>360</value>
>     </property>
>     <property>
>       <name>ha.failover-controller.active-standby-elector.zk.op.retries</name>
>       <value>120</value>
>     </property>
>     <property>
>       <name>ha.zookeeper.quorum</name>
>       <value>sv-htndp2.hdp.supergrp.net:2181,sv-htndp1.hdp.supergrp.net:2181,sv-htndp3.hdp.supergrp.net:2181</value>
>     </property>
>     <property>
>       <name>hadoop.http.authentication.simple.anonymous.allowed</name>
>       <value>true</value>
>     </property>
>     <property>
>       <name>hadoop.proxyuser.admin.groups</name>
>       <value>*</value>
>     </property>
>     <property>
>       <name>hadoop.proxyuser.admin.hosts</name>
>       <value>*</value>
>     </property>
>     <property>
>       <name>hadoop.proxyuser.hcat.groups</name>
>       <value>users</value>
>     </property>
>     <property>
>       <name>hadoop.proxyuser.hcat.hosts</name>
>       <value>sv-htnmn2.hdp.supergrp.net</value>
>     </property>
>     <property>
>       <name>hadoop.proxyuser.hdfs.groups</name>
>       <value>*</value>
>     </property>
>     <property>
>       <name>hadoop.proxyuser.hdfs.hosts</name>
>       <value>*</value>
>     </property>
>     <property>
>       <name>hadoop.proxyuser.hive.groups</name>
>       <value>*</value>
>     </property>
>     <property>
>       <name>hadoop.proxyuser.hive.hosts</name>
>       <value>sv-htnmn2.hdp.supergrp.net</value>
>     </property>
>     <property>
>       <name>hadoop.proxyuser.HTTP.groups</name>
>       <value>users</value>
>     </property>
>     <property>
>       <name>hadoop.proxyuser.HTTP.hosts</name>
>       <value>sv-htnmn2.hdp.supergrp.net</value>
>     </property>
>     <property>
>       <name>hadoop.proxyuser.knox.groups</name>
>       <value>users</value>
>     </property>
>     <property>
>       <name>hadoop.proxyuser.knox.hosts</name>
>       <value>sv-htncmn.hdp.supergrp.net</value>
>     </property>
>     <property>
>       <name>hadoop.proxyuser.oozie.groups</name>
>       <value>*</value>
>     </property>
>     <property>
>       <name>hadoop.proxyuser.oozie.hosts</name>
>       <value>sv-htncmn.hdp.supergrp.net</value>
>     </property>
>     <property>
>       <name>hadoop.proxyuser.root.groups</name>
>       <value>*</value>
>     </property>
>     <property>
>       <name>hadoop.proxyuser.root.hosts</name>
>       <value>*</value>
>     </property>
>     <property>
>       <name>hadoop.proxyuser.yarn.groups</name>
>       <value>*</value>
>     </property>
>     <property>
>       <name>hadoop.proxyuser.yarn.hosts</name>
>       <value>sv-htnmn1.hdp.supergrp.net</value>
>     </property>
>     <property>
>       <name>hadoop.security.auth_to_local</name>
>       <value>RULE:[1:$1@$0]([email protected])s/.*/ambari-qa/
> RULE:[1:$1@$0]([email protected])s/.*/hbase/
> RULE:[1:$1@$0]([email protected])s/.*/hdfs/
> RULE:[1:$1@$0]([email protected])s/.*/spark/
> RULE:[1:$1@$0](.*@HDP.SUPERGRP.NET)s/@.*//
> RULE:[2:$1@$0]([email protected])s/.*/ams/
> RULE:[2:$1@$0]([email protected])s/.*/ams/
> RULE:[2:$1@$0]([email protected])s/.*/ams/
> RULE:[2:$1@$0]([email protected])s/.*/ams/
> RULE:[2:$1@$0]([email protected])s/.*/hdfs/
> RULE:[2:$1@$0]([email protected])s/.*/hbase/
> RULE:[2:$1@$0]([email protected])s/.*/hive/
> RULE:[2:$1@$0]([email protected])s/.*/mapred/
> RULE:[2:$1@$0]([email protected])s/.*/hdfs/
> RULE:[2:$1@$0]([email protected])s/.*/knox/
> RULE:[2:$1@$0]([email protected])s/.*/yarn/
> RULE:[2:$1@$0]([email protected])s/.*/hdfs/
> RULE:[2:$1@$0]([email protected])s/.*/oozie/
> RULE:[2:$1@$0]([email protected])s/.*/yarn/
> RULE:[2:$1@$0]([email protected])s/.*/yarn/
> DEFAULT</value>
>     </property>
>     <property>
>       <name>hadoop.security.authentication</name>
>       <value>kerberos</value>
>     </property>
>     <property>
>       <name>hadoop.security.authorization</name>
>       <value>true</value>
>     </property>
>     <property>
>       <name>hadoop.security.key.provider.path</name>
>       <value></value>
>     </property>
>     <property>
>       <name>io.compression.codecs</name>
>       <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
>     </property>
>     <property>
>       <name>io.file.buffer.size</name>
>       <value>131072</value>
>     </property>
>     <property>
>       <name>io.serializations</name>
>       <value>org.apache.hadoop.io.serializer.WritableSerialization</value>
>     </property>
>     <property>
>       <name>ipc.client.connect.max.retries</name>
>       <value>50</value>
>     </property>
>     <property>
>       <name>ipc.client.connection.maxidletime</name>
>       <value>30000</value>
>     </property>
>     <property>
>       <name>ipc.client.idlethreshold</name>
>       <value>8000</value>
>     </property>
>     <property>
>       <name>ipc.server.tcpnodelay</name>
>       <value>true</value>
>     </property>
>     <property>
>       <name>mapreduce.jobtracker.webinterface.trusted</name>
>       <value>false</value>
>     </property>
>     <property>
>       <name>net.topology.script.file.name</name>
>       <value>/etc/hadoop/conf/topology_script.py</value>
>     </property>
>   </configuration>
>
>
>
> Thanks
>
>
>
> From: Jeff Lord <[email protected]>
> Sent: Thursday, 31 March 2016 08:16 PM
> To: [email protected]
> Subject: Re: Can't connect to Secure HBase cluster
>
>
> Do you have a core-site.xml in your config?
>
> On Thu, Mar 31, 2016 at 4:27 AM, Guillaume Pool <[email protected]> wrote:
>
> Hi,
>
>
>
> I am trying to make a connection to a secured cluster that has Phoenix
> installed.
>
>
>
> I am running HDP 2.3.2 and NiFi 0.6.0
>
>
>
> Getting the following error on trying to enable HBase_1_1_2_ClientService
>
>
>
> 2016-03-31 13:24:23,916 INFO [StandardProcessScheduler Thread-5] o.a.nifi.hbase.HBase_1_1_2_ClientService HBase_1_1_2_ClientService[id=e7e9b2ed-d336-34be-acb4-6c8b60c735c2] HBase Security Enabled, logging in as principal [email protected] with keytab /app/env/nifi.keytab
> 2016-03-31 13:24:23,984 WARN [StandardProcessScheduler Thread-5] org.apache.hadoop.util.NativeCodeLoader Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 2016-03-31 13:24:24,101 INFO [StandardProcessScheduler Thread-5] o.a.nifi.hbase.HBase_1_1_2_ClientService HBase_1_1_2_ClientService[id=e7e9b2ed-d336-34be-acb4-6c8b60c735c2] Successfully logged in as principal [email protected] with keytab /app/env/nifi.keytab
> 2016-03-31 13:24:24,177 ERROR [StandardProcessScheduler Thread-5] o.a.n.c.s.StandardControllerServiceNode HBase_1_1_2_ClientService[id=e7e9b2ed-d336-34be-acb4-6c8b60c735c2] Failed to invoke @OnEnabled method due to java.io.IOException: java.lang.reflect.InvocationTargetException
> 2016-03-31 13:24:24,182 ERROR [StandardProcessScheduler Thread-5] o.a.n.c.s.StandardControllerServiceNode
> java.io.IOException: java.lang.reflect.InvocationTargetException
>         at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240) ~[hbase-client-1.1.2.jar:1.1.2]
>         at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:218) ~[hbase-client-1.1.2.jar:1.1.2]
>         at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119) ~[hbase-client-1.1.2.jar:1.1.2]
>         at org.apache.nifi.hbase.HBase_1_1_2_ClientService$1.run(HBase_1_1_2_ClientService.java:215) ~[nifi-hbase_1_1_2-client-service-0.6.0.jar:0.6.0]
>         at org.apache.nifi.hbase.HBase_1_1_2_ClientService$1.run(HBase_1_1_2_ClientService.java:212) ~[nifi-hbase_1_1_2-client-service-0.6.0.jar:0.6.0]
>         at java.security.AccessController.doPrivileged(Native Method) ~[na:1.8.0_71]
>         at javax.security.auth.Subject.doAs(Subject.java:422) ~[na:1.8.0_71]
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656) ~[hadoop-common-2.6.2.jar:na]
>         at org.apache.nifi.hbase.HBase_1_1_2_ClientService.createConnection(HBase_1_1_2_ClientService.java:212) ~[nifi-hbase_1_1_2-client-service-0.6.0.jar:0.6.0]
>         at org.apache.nifi.hbase.HBase_1_1_2_ClientService.onEnabled(HBase_1_1_2_ClientService.java:161) ~[nifi-hbase_1_1_2-client-service-0.6.0.jar:0.6.0]
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_71]
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_71]
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_71]
>         at java.lang.reflect.Method.invoke(Method.java:497) ~[na:1.8.0_71]
>         at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:137) ~[na:na]
>         at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:125) ~[na:na]
>         at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:70) ~[na:na]
>         at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotation(ReflectionUtils.java:47) ~[na:na]
>         at org.apache.nifi.controller.service.StandardControllerServiceNode$1.run(StandardControllerServiceNode.java:285) ~[na:na]
>         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_71]
>         at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_71]
>         at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_71]
>         at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [na:1.8.0_71]
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_71]
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_71]
>         at java.lang.Thread.run(Thread.java:745) [na:1.8.0_71]
> Caused by: java.lang.reflect.InvocationTargetException: null
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_71]
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_71]
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_71]
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:422) ~[na:1.8.0_71]
>         at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238) ~[hbase-client-1.1.2.jar:1.1.2]
>         ... 25 common frames omitted
> Caused by: java.lang.UnsupportedOperationException: Unable to find org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory
>         at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:36) ~[hbase-common-1.1.2.jar:1.1.2]
>         at org.apache.hadoop.hbase.ipc.RpcControllerFactory.instantiate(RpcControllerFactory.java:58) ~[hbase-client-1.1.2.jar:1.1.2]
>         at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.createAsyncProcess(ConnectionManager.java:2242) ~[hbase-client-1.1.2.jar:1.1.2]
>         at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:690) ~[hbase-client-1.1.2.jar:1.1.2]
>         at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:630) ~[hbase-client-1.1.2.jar:1.1.2]
>         ... 30 common frames omitted
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:381) ~[na:1.8.0_71]
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[na:1.8.0_71]
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[na:1.8.0_71]
>         at java.lang.Class.forName0(Native Method) ~[na:1.8.0_71]
>         at java.lang.Class.forName(Class.java:264) ~[na:1.8.0_71]
>         at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:32) ~[hbase-common-1.1.2.jar:1.1.2]
>         ... 34 common frames omitted
> 2016-03-31 13:24:24,184 ERROR [StandardProcessScheduler Thread-5] o.a.n.c.s.StandardControllerServiceNode Failed to invoke @OnEnabled method of HBase_1_1_2_ClientService[id=e7e9b2ed-d336-34be-acb4-6c8b60c735c2] due to java.io.IOException: java.lang.reflect.InvocationTargetException
>
>
>
> Is anyone experiencing similar issues?
>
>
>
> There is a known issue in HDP 2.3.2 where Phoenix causes problems with HBase
> and MapReduce. Could it be a similar issue?
>
>
>
> Thanks
>
>
>
