I should add that the HDFS processors do support communicating with a
Kerberized HDFS.

There are properties on the processors to configure the keytab and
principal, as well as the following property in nifi.properties:

nifi.kerberos.krb5.file=
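
For example, a working setup usually looks something like this (the
krb5.conf path, principal, and keytab below are placeholders, not values
from this thread):

nifi.kerberos.krb5.file=/etc/krb5.conf

and then on the processor (e.g. PutHDFS):

Kerberos Principal: nifi@EXAMPLE.COM
Kerberos Keytab: /etc/security/keytabs/nifi.keytab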


Thanks,

Bryan

On Mon, Feb 8, 2016 at 11:29 AM, Bryan Bende <[email protected]> wrote:

> Laxman,
>
> The HBase integration does not currently support Kerberized HBase
> installs.
>
> I created a JIRA to track this:
> https://issues.apache.org/jira/browse/NIFI-1488
>
> -Bryan
>
> On Mon, Feb 8, 2016 at 10:36 AM, <[email protected]> wrote:
>
>> Hi,
>>
>> I have configured an HBase client and run kinit on the machine that NiFi
>> is running on, but I get the following error:
>>
>> 2016-02-08 15:30:12,475 WARN [pool-26-thread-1] o.a.hadoop.hbase.ipc.AbstractRpcClient Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
>> 2016-02-08 15:30:12,476 ERROR [pool-26-thread-1] o.a.hadoop.hbase.ipc.AbstractRpcClient SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
>> javax.security.sasl.SaslException: GSS initiate failed
>>         at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(Unknown Source) ~[na:1.8.0_71]
>>         at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179) ~[hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:642) [hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:166) [hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:769) ~[hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:766) ~[hbase-client-1.1.2.jar:1.1.2]
>>         at java.security.AccessController.doPrivileged(Native Method) ~[na:1.8.0_71]
>>         at javax.security.auth.Subject.doAs(Unknown Source) ~[na:1.8.0_71]
>>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656) ~[hadoop-common-2.6.2.jar:na]
>>         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:766) [hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:920) [hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:889) [hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1222) [hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213) [hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287) [hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:50918) [hbase-protocol-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1564) [hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1502) [hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1524) [hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1553) [hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1704) [hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38) [hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124) [hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3917) [hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.client.HBaseAdmin.listTableNames(HBaseAdmin.java:413) [hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.hadoop.hbase.client.HBaseAdmin.listTableNames(HBaseAdmin.java:397) [hbase-client-1.1.2.jar:1.1.2]
>>         at org.apache.nifi.hbase.HBase_1_1_2_ClientService.onEnabled(HBase_1_1_2_ClientService.java:137) [nifi-hbase_1_1_2-client-service-0.4.1.jar:0.4.1]
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_71]
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:1.8.0_71]
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:1.8.0_71]
>>         at java.lang.reflect.Method.invoke(Unknown Source) ~[na:1.8.0_71]
>>         at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:102) [nifi-framework-core-0.4.1.jar:0.4.1]
>>         at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotation(ReflectionUtils.java:47) [nifi-framework-core-0.4.1.jar:0.4.1]
>>         at org.apache.nifi.controller.scheduling.StandardProcessScheduler$6.run(StandardProcessScheduler.java:652) [nifi-framework-core-0.4.1.jar:0.4.1]
>>         at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) [na:1.8.0_71]
>>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [na:1.8.0_71]
>>         at java.lang.Thread.run(Unknown Source) [na:1.8.0_71]
>> Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
>>         at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Unknown Source) ~[na:1.8.0_71]
>>         at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Unknown Source) ~[na:1.8.0_71]
>>         at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Unknown Source) ~[na:1.8.0_71]
>>         at sun.security.jgss.GSSManagerImpl.getMechanismContext(Unknown Source) ~[na:1.8.0_71]
>>         at sun.security.jgss.GSSContextImpl.initSecContext(Unknown Source) ~[na:1.8.0_71]
>>         at sun.security.jgss.GSSContextImpl.initSecContext(Unknown Source) ~[na:1.8.0_71]
>>         ... 37 common frames omitted
>>
>>
>> Does anyone have any experience with NiFi accessing a secured HDFS/HBase
>> setup?
>>
>> I can use the shell on the server fine to get into the HBase shell or run
>> hadoop fs commands, so I know kinit works (see the example commands
>> below). But is there more to configure in nifi.properties?
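>>
>> For example, all of these succeed from my shell on the NiFi host (the
>> principal here is a placeholder):
>>
>> kinit laxman@EXAMPLE.COM
>> klist
>> hadoop fs -ls /
>> hbase shell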
>>
>> Regards,
>> Laxman
>>
>>
>
>
