[
https://issues.apache.org/jira/browse/NIFI-7831?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17264920#comment-17264920
]
Bryan Bende commented on NIFI-7831:
-----------------------------------
[~matagyula] [~rkrist] thank you for following up and reporting your findings.
As far as replacing NARs to test, the KeytabCredentialsService actually has no
real logic; it is just a holder of two string values for the principal and
keytab location. It is up to each processor/service that uses it to do
something with those values, so you would really be looking to replace one of
the HBase NARs and/or the HDFS libraries NAR and HDFS processors NAR.
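As a concrete illustration of the "holder of two string values" point, here is a minimal, self-contained sketch; the interface and class names are hypothetical stand-ins, not the actual NiFi KerberosCredentialsService API:

```java
// Hypothetical stand-in for a keytab credentials service: it holds two
// strings and nothing else. Names below are illustrative, not NiFi's API.
interface CredentialsHolder {
    String getPrincipal();  // Kerberos principal, e.g. "nifi@EXAMPLE.COM"
    String getKeytab();     // filesystem path to the keytab
}

public class KeytabHolderSketch implements CredentialsHolder {
    private final String principal;
    private final String keytab;

    public KeytabHolderSketch(String principal, String keytab) {
        this.principal = principal;
        this.keytab = keytab;
    }

    @Override
    public String getPrincipal() { return principal; }

    @Override
    public String getKeytab() { return keytab; }

    public static void main(String[] args) {
        // A consuming processor/service would read these two values and
        // perform the actual Kerberos login itself.
        CredentialsHolder creds = new KeytabHolderSketch(
                "nifi@EXAMPLE.COM", "/etc/security/keytabs/nifi.keytab");
        System.out.println(creds.getPrincipal() + " / " + creds.getKeytab());
    }
}
```

The point is that no Kerberos login happens in the service itself; the consuming HBase/HDFS/Hive components read the two values and log in on their own, which is why testing a fix means swapping the NARs that contain that login code.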
I am fairly certain that the root cause of a few of these issues is that in
1.12.0, when we introduced support for password-based Kerberos, the changes
ended up no longer setting the static logged-in user in Hadoop's
UserGroupInformation class, and many parts of the Hadoop/HBase client code
depend on this. That was fixed on master with this change:
[https://github.com/apache/nifi/pull/4643]
That code is in nifi-hadoop-utils, which is bundled into the following
NARs...
{code:java}
work/nar//extensions/nifi-hbase_2-client-service-nar-1.13.0-SNAPSHOT.nar-unpacked/NAR-INF/bundled-dependencies/nifi-hadoop-utils-1.13.0-SNAPSHOT.jar
work/nar//extensions/nifi-hadoop-dbcp-service-nar-1.13.0-SNAPSHOT.nar-unpacked/NAR-INF/bundled-dependencies/nifi-hadoop-utils-1.13.0-SNAPSHOT.jar
work/nar//extensions/nifi-livy-nar-1.13.0-SNAPSHOT.nar-unpacked/NAR-INF/bundled-dependencies/nifi-hadoop-utils-1.13.0-SNAPSHOT.jar
work/nar//extensions/nifi-hadoop-nar-1.13.0-SNAPSHOT.nar-unpacked/NAR-INF/bundled-dependencies/nifi-hadoop-utils-1.13.0-SNAPSHOT.jar
work/nar//extensions/nifi-parquet-nar-1.13.0-SNAPSHOT.nar-unpacked/NAR-INF/bundled-dependencies/nifi-hadoop-utils-1.13.0-SNAPSHOT.jar
work/nar//extensions/nifi-hbase_1_1_2-client-service-nar-1.13.0-SNAPSHOT.nar-unpacked/NAR-INF/bundled-dependencies/nifi-hadoop-utils-1.13.0-SNAPSHOT.jar
work/nar//extensions/nifi-hive-nar-1.13.0-SNAPSHOT.nar-unpacked/NAR-INF/bundled-dependencies/nifi-hadoop-utils-1.13.0-SNAPSHOT.jar
{code}
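To make the root-cause theory above concrete: Hadoop's UserGroupInformation keeps a static "login user"; loginUserFromKeytab(...) sets it, while loginUserFromKeytabAndReturnUGI(...) only returns an instance, and HBase's relogin chore fails with "loginUserFromKeyTab must be done first" when the static state was never set. The toy class below is NOT Hadoop code, just a self-contained mock of that contract:

```java
// Self-contained mock of the failure mode -- NOT Hadoop code. It imitates
// the contract between UserGroupInformation's static login state and the
// HBase relogin chore, using simplified versions of the real method names.
import java.io.IOException;

public class UgiMock {
    private static UgiMock loginUser;  // static login state, like UGI's
    private final boolean fromKeytab;

    private UgiMock(boolean fromKeytab) { this.fromKeytab = fromKeytab; }

    // Pre-1.12 style: performs the login AND sets the static login user.
    public static void loginUserFromKeytab() {
        loginUser = new UgiMock(true);
    }

    // 1.12.0 style (simplified): returns an instance, static state untouched.
    public static UgiMock loginUserFromKeytabAndReturnUGI() {
        return new UgiMock(true);
    }

    // What the HBase relogin chore effectively does against the static user.
    public static void reloginChore() throws IOException {
        if (loginUser == null || !loginUser.fromKeytab) {
            throw new IOException("loginUserFromKeyTab must be done first");
        }
        // ...refresh the TGT here...
    }

    public static void main(String[] args) {
        UgiMock ugi = loginUserFromKeytabAndReturnUGI(); // static user never set
        try {
            reloginChore();
            System.out.println("relogin ok");
        } catch (IOException e) {
            System.out.println("chore failed: " + e.getMessage());
        }

        loginUserFromKeytab();  // sets the static login user again
        try {
            reloginChore();
            System.out.println("relogin ok");
        } catch (IOException e) {
            System.out.println("chore failed: " + e.getMessage());
        }
    }
}
```

Under this reading, the fix in the PR above amounts to making the keytab login set the static login user again, so the scheduled relogin chores find valid state.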
> KeytabCredentialsService not working with HBase Clients
> -------------------------------------------------------
>
> Key: NIFI-7831
> URL: https://issues.apache.org/jira/browse/NIFI-7831
> Project: Apache NiFi
> Issue Type: Bug
> Affects Versions: 1.12.0
> Reporter: Manuel Navarro
> Assignee: Tamas Palfy
> Priority: Major
> Fix For: 1.13.0
>
>
> HBase Client (both 1.x and 2.x) is not able to renew its Kerberos ticket
> after expiration with KeytabCredentialsService configured (same behaviour
> with the principal and password configured directly in the controller
> service). The same KeytabCredentialsService works fine with the Hive and
> HDFS clients configured in the same NiFi cluster.
> Note that the same configuration works fine in version 1.11 (errors started
> to appear after upgrading from 1.11 to 1.12).
> After 24 hours (the ticket renewal period in our case), the following error
> appears using HBase_2_ClientService + HBase_2_ClientMapCacheService:
> {code:java}
> 2020-09-17 09:00:27,014 ERROR [Relogin service.Chore.1] org.apache.hadoop.hbase.AuthUtil Got exception while trying to refresh credentials: loginUserFromKeyTab must be done first
> java.io.IOException: loginUserFromKeyTab must be done first
>     at org.apache.hadoop.security.UserGroupInformation.reloginFromKeytab(UserGroupInformation.java:1194)
>     at org.apache.hadoop.security.UserGroupInformation.checkTGTAndReloginFromKeytab(UserGroupInformation.java:1125)
>     at org.apache.hadoop.hbase.AuthUtil$1.chore(AuthUtil.java:206)
>     at org.apache.hadoop.hbase.ScheduledChore.run(ScheduledChore.java:186)
>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>     at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>     at java.lang.Thread.run(Thread.java:748)
> {code}
>
> With HBase_1_1_2_ClientService + HBase_1_1_2_ClientMapCacheService the
> following error appears:
>
> {code:java}
> 2020-09-22 12:18:37,184 WARN [hconnection-0x55d9d8d1-shared--pool3-t769] o.a.hadoop.hbase.ipc.AbstractRpcClient Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
> 2020-09-22 12:18:37,197 ERROR [hconnection-0x55d9d8d1-shared--pool3-t769] o.a.hadoop.hbase.ipc.AbstractRpcClient SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
> javax.security.sasl.SaslException: GSS initiate failed
>     at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
>     at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:612)
>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:157)
>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:738)
>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:735)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:735)
>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:897)
>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:866)
>     at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1208)
>     at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:223)
>     at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:328)
>     at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.multi(ClientProtos.java:32879)
>     at org.apache.hadoop.hbase.client.MultiServerCallable.call(MultiServerCallable.java:128)
>     at org.apache.hadoop.hbase.client.MultiServerCallable.call(MultiServerCallable.java:53)
>     at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:210)
>     at org.apache.hadoop.hbase.client.AsyncProcess$AsyncRequestFutureImpl$SingleServerRequestRunnable.run(AsyncProcess.java:723)
>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>     at java.lang.Thread.run(Thread.java:748)
> Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
> {code}
>
> Environment: Apache NiFi 1.12, RHEL 7.7, openjdk version "1.8.0_222-ea"
> Regards!
--
This message was sent by Atlassian Jira
(v8.3.4#803005)