Your Kerberos ticket (TGT) is most likely expiring. Check your ticket lifetime and renewal settings.
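
If the job runs longer than the ticket lifetime, the usual fix for a long-running Spark Streaming job on a secured YARN cluster is to submit it with a keytab so the framework can re-obtain tickets itself instead of relying on a ticket cache from a manual kinit. A rough sketch, assuming Spark 1.4+ on YARN; the principal, keytab path, class, and jar below are placeholders, not taken from your job:

    # check when the current TGT expires and whether it is renewable
    klist

    # placeholders only: principal, keytab path, class, and jar are examples
    spark-submit \
      --master yarn-cluster \
      --principal s_user@EXAMPLE.COM \
      --keytab /etc/security/keytabs/s_user.keytab \
      --class com.example.StreamingJob \
      streaming-job.jar

It is also worth comparing the ~14-hour failure point against ticket_lifetime and renew_lifetime in krb5.conf and on the KDC; default lifetimes are often in the 10-24 hour range, which would line up with a job dying partway through the day.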

-Ilya Ganelin

On Mon, Nov 16, 2015 at 9:20 PM, Vipul Rai <vipulrai8...@gmail.com> wrote:

> Hi Nikhil,
> It seems you have a Kerberos-enabled cluster and the job is unable to
> authenticate using its ticket. Please check the Kerberos settings; it
> could also be caused by a Kerberos version mismatch across the nodes.
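>
> As a quick sanity check (the commands below are illustrative; substitute
> your own principal and keytab path), you can confirm whether the ticket
> itself is the problem and compare Kerberos versions and encryption types
> across nodes:
>
>     kinit -kt /etc/security/keytabs/s_user.keytab s_user@EXAMPLE.COM
>     klist -e     # ticket validity plus the encryption types in use
>     klist -V     # prints the Kerberos library version on this node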
>
> Thanks,
> Vipul
>
> On Tue 17 Nov, 2015 07:31 Nikhil Gs <gsnikhil1432...@gmail.com> wrote:
>
>> Hello Team,
>>
>> Below is the error we are facing in our cluster about 14 hours after
>> starting the spark-submit job. We are not able to understand the issue or
>> why it hits this error after a certain amount of time.
>>
>> If any of you have faced the same scenario or have any idea, please guide
>> us. If you need any other information to identify the issue, please let me
>> know. Thanks a lot in advance.
>>
>> Log Error:
>>
>> 15/11/16 04:54:48 ERROR ipc.AbstractRpcClient: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
>> javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
>>     at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>     at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
>>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:605)
>>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:154)
>>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:731)
>>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:728)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
>>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:728)
>>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:881)
>>     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:850)
>>     at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1174)
>>     at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:216)
>>     at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:300)
>>     at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.get(ClientProtos.java:31865)
>>     at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1580)
>>     at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1294)
>>     at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1126)
>>     at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:369)
>>     at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:320)
>>     at org.apache.hadoop.hbase.client.BufferedMutatorImpl.backgroundFlushCommits(BufferedMutatorImpl.java:206)
>>     at org.apache.hadoop.hbase.client.BufferedMutatorImpl.flush(BufferedMutatorImpl.java:183)
>>     at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1482)
>>     at org.apache.hadoop.hbase.client.HTable.put(HTable.java:1095)
>>     at com.suxxxxxk.bigdata.pulse.consumer.ModempollHbaseLoadHelper$1.run(ModempollHbaseLoadHelper.java:89)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:356)
>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1651)
>>     at com.suxxxxxk.bigdata.pulse.consumer.ModempollHbaseLoadHelper.loadToHbase(ModempollHbaseLoadHelper.java:48)
>>     at com.suxxxxxk.bigdata.pulse.consumer.ModempollSparkStreamingEngine$1.call(ModempollSparkStreamingEngine.java:52)
>>     at com.suxxxxxk.bigdata.pulse.consumer.ModempollSparkStreamingEngine$1.call(ModempollSparkStreamingEngine.java:48)
>>     at org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1.apply(JavaPairRDD.scala:999)
>>     at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
>>     at scala.collection.Iterator$$anon$10.next(Iterator.scala:312)
>>     at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>>     at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
>>     at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
>>     at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
>>     at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
>>     at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
>>     at scala.collection.AbstractIterator.to(Iterator.scala:1157)
>>     at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
>>     at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
>>     at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
>>     at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
>>     at org.apache.spark.rdd.RDD$$anonfun$33.apply(RDD.scala:1177)
>>     at org.apache.spark.rdd.RDD$$anonfun$33.apply(RDD.scala:1177)
>>     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1503)
>>     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1503)
>>     at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
>>     at org.apache.spark.scheduler.Task.run(Task.scala:64)
>>     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
>>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>     at java.lang.Thread.run(Thread.java:745)
>> Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
>>     at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>     at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>     at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>     at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>     at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>     at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>     at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>     ... 55 more
>>
>> 15/11/16 04:54:50 WARN security.UserGroupInformation: PriviledgedActionException as:s_sdldalplhdxxx...@sxxxxxxxk.com (auth:KERBEROS) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
>>
>> 15/11/16 04:54:50 WARN ipc.AbstractRpcClient: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
>>
>> 15/11/16 04:54:50 ERROR ipc.AbstractRpcClient: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
>> javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
>>
>> Thanks!
>>
>
