[ https://issues.apache.org/jira/browse/HBASE-17040?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15645707#comment-15645707 ]
Sean Busbey commented on HBASE-17040:
-------------------------------------
(Since this is an unreleased feature, the most expedient help will generally
come from using the support mechanism of the vendor who is publishing it to
their downstream users. For future reference, that's
[community.cloudera.com|https://community.cloudera.com/]. But I'm likely to be
the person answering in any case :joy_cat:. If this had been in an Apache HBase
release, usually user@hbase would be the place to start.)
Can you post the stack trace for the error when using {{\-\-principal}} and
{{\-\-keytab}}? Those are what I would expect to be needed for {{\-\-deploy-mode
cluster}}, and they're what I previously tested, prior to the hbase-spark module,
as documented in [the hbase-downstreamer
project|https://github.com/saintstack/hbase-downstreamer#spark-streaming-test-application].
> HBase Spark does not work in Kerberos and yarn-master mode
> ----------------------------------------------------------
>
> Key: HBASE-17040
> URL: https://issues.apache.org/jira/browse/HBASE-17040
> Project: HBase
> Issue Type: Bug
> Components: spark
> Affects Versions: 2.0.0
> Environment: HBase
> Kerberos
> Yarn
> Cloudera
> Reporter: Binzi Cao
>
> We are loading HBase records into an RDD with the hbase-spark library on
> Cloudera.
> The hbase-spark code works if we submit the job in client mode, but it does
> not work in cluster mode. We get the exceptions below:
> ```
> 16/11/07 05:43:28 WARN security.UserGroupInformation: PriviledgedActionException as:spark (auth:SIMPLE) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
> 16/11/07 05:43:28 WARN ipc.RpcClientImpl: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
> 16/11/07 05:43:28 ERROR ipc.RpcClientImpl: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
> javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
>     at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
>     at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:181)
>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:617)
>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$700(RpcClientImpl.java:162)
>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:743)
>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:740)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:740)
>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
>     at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1242)
>     at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
>     at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
>     at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.execService(ClientProtos.java:34118)
>     at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1627)
>     at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:92)
>     at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:89)
>     at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
>     at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:95)
>     at org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callBlockingMethod(CoprocessorRpcChannel.java:73)
>     at org.apache.hadoop.hbase.protobuf.generated.AuthenticationProtos$AuthenticationService$BlockingStub.getAuthenticationToken(AuthenticationProtos.java:4512)
>     at org.apache.hadoop.hbase.security.token.TokenUtil.obtainToken(TokenUtil.java:86)
>     at org.apache.hadoop.hbase.security.token.TokenUtil$1.run(TokenUtil.java:111)
>     at org.apache.hadoop.hbase.security.token.TokenUtil$1.run(TokenUtil.java:108)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
>     at org.apache.hadoop.hbase.security.User$SecureHadoopUser.runAs(User.java:340)
>     at org.apache.hadoop.hbase.security.token.TokenUtil.obtainToken(TokenUtil.java:108)
>     at org.apache.hadoop.hbase.security.token.TokenUtil.addTokenForJob(TokenUtil.java:329)
>     at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.initCredentials(TableMapReduceUtil.java:490)
>     at org.apache.hadoop.hbase.spark.HBaseContext.<init>(HBaseContext.scala:70)
> ```