Spark Streaming application failing with Token issue

2016-08-18 Thread Kamesh
Hi all,

 I am running a Spark Streaming application that stores events in a
secure (Kerberized) HBase cluster. I launched this Spark Streaming
application with --principal and --keytab. Despite this, the
application fails after *7 days* with a token issue. Can someone
please suggest how to fix this?
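
For reference, the job is launched roughly as below (the realm, keytab
path, class name, and jar name are placeholders, not the real values):

  spark-submit \
    --master yarn-cluster \
    --principal sys_bio_replicator@EXAMPLE.COM \
    --keytab /path/to/sys_bio_replicator.keytab \
    --class com.example.StreamingHBaseWriter \
    streaming-app.jar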

*Error Message*

16/08/18 02:39:45 WARN ipc.AbstractRpcClient: Exception encountered while
connecting to the server :
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.token.SecretManager$InvalidToken):
Token has expired

16/08/18 02:39:45 WARN security.UserGroupInformation:
PriviledgedActionException as:sys_bio_replicator (auth:KERBEROS)
cause:org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.token.SecretManager$InvalidToken):
Token has expired

*Environment*

Spark Version : 1.6.1

HBase version : 1.0.0

Hadoop Version : 2.6.0


Thanks & Regards
Kamesh.
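
If I read the Spark 1.6 code right, the keytab is used to renew HDFS
delegation tokens but the HBase token is obtained only once at
submission, and *7 days* matches HBase's default token maximum lifetime
(hbase.auth.token.max.lifetime). A minimal sketch of one possible
workaround, assuming the keytab is shipped to the executors (e.g. via
--files) and keytab-based login is acceptable in this cluster:

  import org.apache.hadoop.security.UserGroupInformation

  // Sketch only: log in from a keytab on the executor so HBase can
  // authenticate with a fresh TGT instead of the expired token.
  object HBaseLogin {
    @volatile private var loggedIn = false

    def ensureLogin(principal: String, keytab: String): Unit = synchronized {
      if (!loggedIn) {
        UserGroupInformation.loginUserFromKeytab(principal, keytab)
        loggedIn = true
      }
      // No-op unless the ticket is close to expiry.
      UserGroupInformation.getLoginUser.checkTGTAndReloginFromKeytab()
    }
  }

Calling HBaseLogin.ensureLogin(...) at the top of each foreachPartition,
before the HBase connection is created, should keep the executors
authenticated for the life of the job.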


Re: Spark Streaming application failing with Kerberos issue while writing data to HBase

2016-06-14 Thread Kamesh
Thanks Ted.

Thanks & Regards
Kamesh.

On Mon, Jun 13, 2016 at 10:48 PM, Ted Yu <yuzhih...@gmail.com> wrote:

> Can you show a snippet of your code, please?
>
> Please refer to obtainTokenForHBase() in
> yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnSparkHadoopUtil.scala
>
> Cheers
>
> On Mon, Jun 13, 2016 at 4:44 AM, Kamesh <kam.iit...@gmail.com> wrote:
>
>> Hi All,
>>  We are building a Spark Streaming application that writes data to an
>> HBase table, but writes/reads are failing with the following
>> exception:
>>
>> 16/06/13 04:35:16 ERROR ipc.AbstractRpcClient: SASL authentication
>> failed. The most likely cause is missing or invalid credentials. Consider
>> 'kinit'.
>>
>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>> GSSException: No valid credentials provided (Mechanism level: Failed to
>> find any Kerberos tgt)]
>>
>> at
>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>
>> at
>> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
>>
>> at
>> org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:605)
>>
>> The application is failing on the executor machines; the executor is
>> not able to present the token. Can someone help me resolve this issue?
>>
>> *Environment Details*
>> Spark Version : 1.6.1
>> HBase Version : 1.0.0
>> Hadoop Version : 2.6.0
>>
>> --
>> Thanks & Regards
>> Kamesh.
>>
>
>
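
The pointer above refers to the token grab Spark does at submission
time. A simplified, non-reflective sketch of what obtainTokenForHBase()
amounts to (the real code loads these classes reflectively; this assumes
the HBase client jars are on the driver classpath):

  import org.apache.hadoop.conf.Configuration
  import org.apache.hadoop.hbase.HBaseConfiguration
  import org.apache.hadoop.hbase.security.token.TokenUtil
  import org.apache.hadoop.security.Credentials

  object HBaseTokenHelper {
    // Ask HBase for a delegation token (requires a valid TGT on the
    // driver) and stash it in the credentials shipped to the executors.
    def obtainHBaseToken(conf: Configuration, creds: Credentials): Unit = {
      val hbaseConf = HBaseConfiguration.create(conf)
      if ("kerberos" == hbaseConf.get("hbase.security.authentication")) {
        val token = TokenUtil.obtainToken(hbaseConf)
        creds.addToken(token.getService, token)
      }
    }
  }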


Spark Streaming application failing with Kerberos issue while writing data to HBase

2016-06-13 Thread Kamesh
Hi All,
 We are building a Spark Streaming application that writes data to an
HBase table, but writes/reads are failing with the following exception:

16/06/13 04:35:16 ERROR ipc.AbstractRpcClient: SASL authentication failed.
The most likely cause is missing or invalid credentials. Consider 'kinit'.

javax.security.sasl.SaslException: GSS initiate failed [Caused by
GSSException: No valid credentials provided (Mechanism level: Failed to
find any Kerberos tgt)]

at
com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)

at
org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)

at
org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:605)

The application is failing on the executor machines; the executor is not
able to present the token. Can someone help me resolve this issue?
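
For context, the write path is the usual foreachPartition pattern below
(the stream, table name, and toPut helper are placeholders, not exact
copies of our code); the SaslException above is thrown on the executor
when the HBase connection is opened:

  import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
  import org.apache.hadoop.hbase.client.ConnectionFactory

  stream.foreachRDD { rdd =>
    rdd.foreachPartition { events =>
      // Runs on the executor: the RPC/SASL handshake happens here, and
      // fails when no HBase token or Kerberos TGT is available.
      val connection = ConnectionFactory.createConnection(HBaseConfiguration.create())
      val table = connection.getTable(TableName.valueOf("events"))
      try {
        events.foreach(e => table.put(toPut(e)))
      } finally {
        table.close()
        connection.close()
      }
    }
  }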

*Environment Details*
Spark Version : 1.6.1
HBase Version : 1.0.0
Hadoop Version : 2.6.0

--
Thanks & Regards
Kamesh.


exception while running pi example on yarn cluster

2014-03-08 Thread Venkata siva kamesh Bhallamudi
Hi All,
 I am new to Spark and am running the Pi example on a YARN cluster. I am
getting the following exception:

Exception in thread "main" java.lang.NullPointerException
at
scala.collection.mutable.ArrayOps$ofRef$.length$extension(ArrayOps.scala:114)
at scala.collection.mutable.ArrayOps$ofRef.length(ArrayOps.scala:114)
at
scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:32)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
at
org.apache.spark.deploy.yarn.Client$.populateClasspath(Client.scala:518)
at org.apache.spark.deploy.yarn.Client.setupLaunchEnv(Client.scala:333)
at org.apache.spark.deploy.yarn.Client.runApp(Client.scala:94)
at org.apache.spark.deploy.yarn.Client.run(Client.scala:115)
at org.apache.spark.deploy.yarn.Client$.main(Client.scala:492)
at org.apache.spark.deploy.yarn.Client.main(Client.scala)

I am using
Spark Version : 0.9.0
YARN Version : 2.3.0

Please help me figure out where I am going wrong.
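
For reference, the job is launched with the 0.9.0-style YARN client
invocation below (the assembly jar names are from my build and may
differ):

  SPARK_JAR=assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.3.0.jar \
  ./bin/spark-class org.apache.spark.deploy.yarn.Client \
    --jar examples/target/scala-2.10/spark-examples-assembly-0.9.0-incubating.jar \
    --class org.apache.spark.examples.SparkPi \
    --args yarn-standalone \
    --num-workers 2 \
    --master-memory 2g \
    --worker-memory 1g \
    --worker-cores 1

(The NPE inside populateClasspath looks like the classpath array coming
back null; on Hadoop 2.3, yarn.application.classpath has no default
value, so setting it explicitly in yarn-site.xml may be relevant, though
that is a guess.)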

Thanks & Regards
Kamesh.