I have no idea what the right way to solve it is, but this is a Kerberos error: the cluster expects your process to hold a Kerberos ticket-granting ticket (TGT), but you haven't got one.

Its suggestion of using 'kinit' points you towards a way of getting such a ticket: kinit is the Linux command for starting a Kerberos session and retrieving a TGT. But to use it, you need the right Kerberos configuration (krb5.conf) on the client.
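One thing worth knowing: kinit only puts a ticket in the credential cache of the machine where you run it, so in yarn-cluster mode the executors running on other hosts won't see it. As a minimal sketch (not a tested recipe; the principal and keytab path are placeholders), the programmatic equivalent of kinit inside a JVM is Hadoop's UserGroupInformation keytab login:

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.security.UserGroupInformation

    val conf = new Configuration()
    // Tell Hadoop's security layer that the cluster uses Kerberos.
    conf.set("hadoop.security.authentication", "kerberos")
    UserGroupInformation.setConfiguration(conf)

    // Programmatic equivalent of kinit: acquire a TGT from a keytab so this
    // JVM holds valid Kerberos credentials. Both arguments are placeholders.
    UserGroupInformation.loginUserFromKeytab(
      "user@EXAMPLE.COM", "/etc/security/keytabs/user.keytab")

Each executor JVM would need credentials along these lines, or a delegation token shipped from the driver, before it can open the HBase connection.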

James

On 07/12/15 19:54, Akhilesh Pathodia wrote:
Hi,

I am running a Spark job on YARN in cluster mode on a secured cluster. I am trying to run Spark on HBase using Phoenix, but the Spark executors are unable to get an HBase connection through Phoenix. I run the kinit command to get a ticket before starting the job, and the keytab file and principal are correctly specified in the connection URL. But the Spark job on each node still throws the error below:

15/12/01 03:23:15 ERROR ipc.AbstractRpcClient: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
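For reference, the connection attempt looks roughly like this (the quorum, znode, principal, and keytab path below are placeholders, not my real values):

    import java.sql.DriverManager

    // Phoenix JDBC URL for a secured cluster: the principal and keytab file
    // come after the ZooKeeper quorum, port and HBase znode parent.
    // All values here are placeholders.
    val url = "jdbc:phoenix:zk1.example.com:2181:/hbase-secure" +
      ":user@EXAMPLE.COM:/etc/security/keytabs/user.keytab"
    val conn = DriverManager.getConnection(url)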

I am using Spark 1.3.1, HBase 1.0.0, and Phoenix 4.3. I am able to run Spark on HBase (without Phoenix) successfully in yarn-client mode, as described in this link:

https://github.com/cloudera-labs/SparkOnHBase#scan-that-works-on-kerberos

Also, I found that there is a known issue with yarn-cluster mode in Spark 1.3.1:

https://issues.apache.org/jira/browse/SPARK-6918

Has anybody successfully run Spark on HBase using Phoenix in yarn-cluster or yarn-client mode?

Thanks,
Akhilesh Pathodia
