[ https://issues.apache.org/jira/browse/NIFI-1712?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15226376#comment-15226376 ]
Josh Elser commented on NIFI-1712:
----------------------------------
You're right that it does perform the Kerberos login correctly, but it then fails to ever talk to ZooKeeper:
{noformat}
2016-04-05 16:06:58,185 INFO [StandardProcessScheduler
Thread-6-SendThread(sv-htndp2.hdp.supergrp.net:2181)]
org.apache.zookeeper.ClientCnxn Session establishment complete on server
sv-htndp2.hdp.supergrp.net/172.27.0.107:2181, sessionid = 0x2537c0236e90593,
negotiated timeout = 40000
2016-04-05 16:06:58,190 ERROR [StandardProcessScheduler
Thread-6-SendThread(sv-htndp2.hdp.supergrp.net:2181)]
o.a.zookeeper.client.ZooKeeperSaslClient An error:
(java.security.PrivilegedActionException: javax.security.sasl.SaslException:
GSS initiate failed [Caused by GSSException: No valid credentials provided
(Mechanism level: Fail to create credential. (63) - No service creds)])
occurred when evaluating Zookeeper Quorum Member's received SASL token.
Zookeeper Client will go to AUTH_FAILED state.
2016-04-05 16:06:58,190 ERROR [StandardProcessScheduler
Thread-6-SendThread(sv-htndp2.hdp.supergrp.net:2181)]
org.apache.zookeeper.ClientCnxn SASL authentication with Zookeeper Quorum
member failed: javax.security.sasl.SaslException: An error:
(java.security.PrivilegedActionException: javax.security.sasl.SaslException:
GSS initiate failed [Caused by GSSException: No valid credentials provided
(Mechanism level: Fail to create credential. (63) - No service creds)])
occurred when evaluating Zookeeper Quorum Member's received SASL token.
Zookeeper Client will go to AUTH_FAILED state.
{noformat}
Specifically, this is the concerning part:
{noformat}
Mechanism level: Fail to create credential. (63) - No service creds
{noformat}
IIRC, this is telling you that the client library cannot find a service
ticket for your KDC in the realm HDP.SUPERGRP.NET. In other words, it doesn't
know about krbtgt/HDP.SUPERGRP.NET@HDP.SUPERGRP.NET. Is {{/etc/krb5.conf}}
correct on your system? Does the realm of the KDC that {{/etc/krb5.conf}}
points to match the realm of this principal ([email protected])? What
principal is your ZooKeeper instance using? Does the ZooKeeper principal exist
in the same realm as NiFi's?
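As a quick sanity check (assuming the MIT Kerberos client tools are installed; the keytab path and ZooKeeper hostname below are taken from the logs above, but the principal names are placeholders since the real ones are obfuscated here), something like this would confirm whether the keytab, realm, and service principal line up:
{noformat}
# List the principals the keytab actually contains, and their realm
klist -kt /app/env/nifi.keytab

# Obtain a TGT as the NiFi principal (placeholder name) and inspect it
kinit -kt /app/env/nifi.keytab nifi@HDP.SUPERGRP.NET
klist

# Ask the KDC for the ZooKeeper service ticket directly; "zookeeper/<host>"
# is the conventional ZooKeeper SASL service principal, so a failure here
# would reproduce the "No service creds" error outside of NiFi
kvno zookeeper/sv-htndp2.hdp.supergrp.net@HDP.SUPERGRP.NET
{noformat}
If {{kvno}} fails while {{kinit}} succeeds, that would point at a realm or service-principal mismatch rather than the keytab itself.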
> HBaseClientService unable to connect when Phoenix is installed
> --------------------------------------------------------------
>
> Key: NIFI-1712
> URL: https://issues.apache.org/jira/browse/NIFI-1712
> Project: Apache NiFi
> Issue Type: Bug
> Reporter: Bryan Bende
> Priority: Minor
>
> A user reported running HDP 2.3.2 with Phoenix installed, and NiFi 0.6.0,
> with the following error:
>
> 2016-03-31 13:24:23,916 INFO [StandardProcessScheduler Thread-5]
> o.a.nifi.hbase.HBase_1_1_2_ClientService
> HBase_1_1_2_ClientService[id=e7e9b2ed-d336-34be-acb4-6c8b60c735c2] HBase
> Security Enabled, logging in as principal [email protected] with keytab
> /app/env/nifi.keytab
> 2016-03-31 13:24:23,984 WARN [StandardProcessScheduler Thread-5]
> org.apache.hadoop.util.NativeCodeLoader Unable to load native-hadoop library
> for your platform... using builtin-java classes where applicable
> 2016-03-31 13:24:24,101 INFO [StandardProcessScheduler Thread-5]
> o.a.nifi.hbase.HBase_1_1_2_ClientService
> HBase_1_1_2_ClientService[id=e7e9b2ed-d336-34be-acb4-6c8b60c735c2]
> Successfully logged in as principal [email protected] with keytab
> /app/env/nifi.keytab
> 2016-03-31 13:24:24,177 ERROR [StandardProcessScheduler Thread-5]
> o.a.n.c.s.StandardControllerServiceNode
> HBase_1_1_2_ClientService[id=e7e9b2ed-d336-34be-acb4-6c8b60c735c2] Failed to
> invoke @OnEnabled method due to java.io.IOException:
> java.lang.reflect.InvocationTargetException
> 2016-03-31 13:24:24,182 ERROR [StandardProcessScheduler Thread-5]
> o.a.n.c.s.StandardControllerServiceNode
> java.io.IOException: java.lang.reflect.InvocationTargetException
> at
> org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
> ~[hbase-client-1.1.2.jar:1.1.2]
> at
> org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:218)
> ~[hbase-client-1.1.2.jar:1.1.2]
> at
> org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119)
> ~[hbase-client-1.1.2.jar:1.1.2]
> at
> org.apache.nifi.hbase.HBase_1_1_2_ClientService$1.run(HBase_1_1_2_ClientService.java:215)
> ~[nifi-hbase_1_1_2-client-service-0.6.0.jar:0.6.0]
> at
> org.apache.nifi.hbase.HBase_1_1_2_ClientService$1.run(HBase_1_1_2_ClientService.java:212)
> ~[nifi-hbase_1_1_2-client-service-0.6.0.jar:0.6.0]
> at java.security.AccessController.doPrivileged(Native Method)
> ~[na:1.8.0_71]
> at javax.security.auth.Subject.doAs(Subject.java:422) ~[na:1.8.0_71]
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
> ~[hadoop-common-2.6.2.jar:na]
> at
> org.apache.nifi.hbase.HBase_1_1_2_ClientService.createConnection(HBase_1_1_2_ClientService.java:212)
> ~[nifi-hbase_1_1_2-client-service-0.6.0.jar:0.6.0]
> at
> org.apache.nifi.hbase.HBase_1_1_2_ClientService.onEnabled(HBase_1_1_2_ClientService.java:161)
> ~[nifi-hbase_1_1_2-client-service-0.6.0.jar:0.6.0]
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> ~[na:1.8.0_71]
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> ~[na:1.8.0_71]
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> ~[na:1.8.0_71]
> at java.lang.reflect.Method.invoke(Method.java:497) ~[na:1.8.0_71]
> at
> org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:137)
> ~[na:na]
> at
> org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:125)
> ~[na:na]
> at
> org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:70)
> ~[na:na]
> at
> org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotation(ReflectionUtils.java:47)
> ~[na:na]
> at
> org.apache.nifi.controller.service.StandardControllerServiceNode$1.run(StandardControllerServiceNode.java:285)
> ~[na:na]
> at
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> [na:1.8.0_71]
> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> [na:1.8.0_71]
> at
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
> [na:1.8.0_71]
> at
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
> [na:1.8.0_71]
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> [na:1.8.0_71]
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> [na:1.8.0_71]
> at java.lang.Thread.run(Thread.java:745) [na:1.8.0_71]
> Caused by: java.lang.reflect.InvocationTargetException: null
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method) ~[na:1.8.0_71]
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> ~[na:1.8.0_71]
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> ~[na:1.8.0_71]
> at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
> ~[na:1.8.0_71]
> at
> org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
> ~[hbase-client-1.1.2.jar:1.1.2]
> ... 25 common frames omitted
> Caused by: java.lang.UnsupportedOperationException: Unable to find
> org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory
> at
> org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:36)
> ~[hbase-common-1.1.2.jar:1.1.2]
> at
> org.apache.hadoop.hbase.ipc.RpcControllerFactory.instantiate(RpcControllerFactory.java:58)
> ~[hbase-client-1.1.2.jar:1.1.2]
> at
> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.createAsyncProcess(ConnectionManager.java:2242)
> ~[hbase-client-1.1.2.jar:1.1.2]
> at
> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:690)
> ~[hbase-client-1.1.2.jar:1.1.2]
> at
> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:630)
> ~[hbase-client-1.1.2.jar:1.1.2]
> ... 30 common frames omitted
> Caused by: java.lang.ClassNotFoundException:
> org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory
> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> ~[na:1.8.0_71]
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> ~[na:1.8.0_71]
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> ~[na:1.8.0_71]
> at java.lang.Class.forName0(Native Method) ~[na:1.8.0_71]
> at java.lang.Class.forName(Class.java:264) ~[na:1.8.0_71]
> at
> org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:32)
> ~[hbase-common-1.1.2.jar:1.1.2]
> ... 34 common frames omitted
> 2016-03-31 13:24:24,184 ERROR [StandardProcessScheduler Thread-5]
> o.a.n.c.s.StandardControllerServiceNode Failed to invoke @OnEnabled method of
> HBase_1_1_2_ClientService[id=e7e9b2ed-d336-34be-acb4-6c8b60c735c2] due to
> java.io.IOException: java.lang.reflect.InvocationTargetException
>
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)