Re: Kerberos setup in Apache spark connecting to remote HDFS/Yarn

2016-06-17 Thread Sudarshan Rangarajan
Hi Ami,

Did you try setting spark.yarn.principal and spark.yarn.keytab as
configuration properties, passing in their corresponding Kerberos values?

Search for these properties on
http://spark.apache.org/docs/latest/running-on-yarn.html to learn more
about what's expected for them.
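
For example (a sketch only; the principal and keytab path below are
placeholders, and this assumes the Spark 1.x artifacts are on the classpath):

```scala
import org.apache.spark.SparkConf

// Placeholders: substitute your real Kerberos principal and keytab path.
val conf = new SparkConf()
  .setAppName("kerberos-example")
  .set("spark.yarn.principal", "user@EXAMPLE.COM")
  .set("spark.yarn.keytab", "/path/to/user.keytab")
```

If I remember correctly, spark-submit also accepts the equivalent
--principal and --keytab options on the command line.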

Regards,
Sudarshan

On Fri, Jun 17, 2016 at 12:01 PM, akhandeshi <ami.khande...@gmail.com>
wrote:

> Little more progress...
>
> I added a few environment variables; now I get the following error message:
>
>  InvocationTargetException: Can't get Master Kerberos principal for use as
> renewer -> [Help 1]
>
>
>
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Kerberos-setup-in-Apache-spark-connecting-to-remote-HDFS-Yarn-tp27181p27189.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>


Re: Kerberos setup in Apache spark connecting to remote HDFS/Yarn

2016-06-17 Thread akhandeshi
Little more progress...

I added a few environment variables; now I get the following error message:

 InvocationTargetException: Can't get Master Kerberos principal for use as
renewer -> [Help 1]
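
For reference, this particular error usually means the client-side Hadoop
configuration does not declare the YARN ResourceManager's Kerberos principal,
so the delegation-token renewer cannot be determined. A hypothetical
yarn-site.xml fragment (the realm and the _HOST pattern are placeholders):

```xml
<!-- Hypothetical fragment; realm and host pattern are placeholders. -->
<property>
  <name>yarn.resourcemanager.principal</name>
  <value>yarn/_HOST@EXAMPLE.COM</value>
</property>
```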









Re: Kerberos setup in Apache spark connecting to remote HDFS/Yarn

2016-06-16 Thread akhandeshi
Rest of the stack trace:

[WARNING]
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.codehaus.mojo.exec.ExecJavaMojo$1.run(ExecJavaMojo.java:294)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalArgumentException: Can't get Kerberos realm
    at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:65)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:227)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:214)
    at org.apache.hadoop.security.UserGroupInformation.isAuthenticationMethodEnabled(UserGroupInformation.java:275)
    at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:269)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:820)
    at org.apache.spark.examples.SparkYarn$.launchClient(SparkYarn.scala:57)
    at org.apache.spark.examples.SparkYarn$.main(SparkYarn.scala:84)
    at org.apache.spark.examples.SparkYarn.main(SparkYarn.scala)
    ... 6 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.security.authentication.util.KerberosUtil.getDefaultRealm(KerberosUtil.java:75)
    at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:63)
    ... 14 more
Caused by: KrbException: Cannot locate default realm
    at sun.security.krb5.Config.getDefaultRealm(Config.java:1029)

I did add krb5.conf to the classpath, as well as defining KRB5_CONFIG.
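
One thing worth checking: the JVM's built-in Kerberos support does not read
the KRB5_CONFIG environment variable. It looks at the java.security.krb5.conf
system property, falling back to platform default locations such as
/etc/krb5.conf. A minimal sketch (the path is a placeholder; it must be set
before any Hadoop or Kerberos class initializes):

```scala
// Sketch: point the JVM at a krb5.conf explicitly. The path is a placeholder;
// set it before any Hadoop security class is touched.
object Krb5ConfSketch {
  def main(args: Array[String]): Unit = {
    System.setProperty("java.security.krb5.conf", "/etc/krb5.conf")
    // Read the property back to confirm it is set.
    println(System.getProperty("java.security.krb5.conf"))
  }
}
```

The same effect can be had with -Djava.security.krb5.conf=/etc/krb5.conf on
the JVM command line, which avoids ordering problems entirely.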




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Kerberos-setup-in-Apache-spark-connecting-to-remote-HDFS-Yarn-tp27181p27183.html



Re: Kerberos setup in Apache spark connecting to remote HDFS/Yarn

2016-06-16 Thread Ami Khandeshi
Spark 1.6.1; Java 7; Hadoop 2.6

On Thursday, June 16, 2016, Ted Yu <yuzhih...@gmail.com> wrote:

> bq. Caused by: KrbException: Cannot locate default realm
>
> Can you show the rest of the stack trace?
>
> What versions of Spark / Hadoop are you using?
>
> Which version of Java are you using (local and in cluster)?
>
> Thanks
>
> On Thu, Jun 16, 2016 at 6:32 AM, akhandeshi <ami.khande...@gmail.com>
> wrote:
>
>> [snip]
>


Re: Kerberos setup in Apache spark connecting to remote HDFS/Yarn

2016-06-16 Thread Ted Yu
bq. Caused by: KrbException: Cannot locate default realm

Can you show the rest of the stack trace?

What versions of Spark / Hadoop are you using?

Which version of Java are you using (local and in cluster)?

Thanks

On Thu, Jun 16, 2016 at 6:32 AM, akhandeshi <ami.khande...@gmail.com> wrote:

> [snip]


Kerberos setup in Apache spark connecting to remote HDFS/Yarn

2016-06-16 Thread akhandeshi
I am trying to set up my IDE for a Scala Spark application. I want to
access HDFS files from a remote Hadoop server that has Kerberos enabled. My
understanding is that I should be able to do that from Spark. Here is my
code so far:

import java.io.IOException

// StringUtils here is assumed to be commons-lang; adjust the package if
// commons-lang3 is on the classpath instead.
import org.apache.commons.lang.StringUtils
import org.apache.hadoop.security.UserGroupInformation
import org.apache.spark.{SparkConf, SparkContext}

val sparkConf = new SparkConf().setAppName(appName).setMaster(master)

if (jars.nonEmpty) {
  sparkConf.setJars(jars)
}

if (properties.nonEmpty) {
  for ((k, v) <- properties) sparkConf.set(k, v)
} else {
  sparkConf
    .set("spark.executor.memory", "1024m")
    .set("spark.cores.max", "1")
    .set("spark.default.parallelism", "4")
}

try {
  if (!StringUtils.isBlank(principal) && !StringUtils.isBlank(keytab)) {
    // UserGroupInformation.setConfiguration(config)
    UserGroupInformation.loginUserFromKeytab(principal, keytab)
  }
} catch {
  case ioe: IOException =>
    println("Failed to login to Hadoop [principal = " + principal +
      ", keytab = " + keytab + "]")
    ioe.printStackTrace()
}

val sc = new SparkContext(sparkConf)
val MY_FILE: String = "hdfs://remoteserver:port/file.out"
val rDD = sc.textFile(MY_FILE, 10)
println("Lines " + rDD.count)
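
Note that with the UserGroupInformation.setConfiguration(...) call commented
out, UGI initializes from whatever default configuration it happens to find,
and the keytab login may run without Kerberos enabled. A sketch of the login
sequence (assumes the Hadoop client libraries on the classpath; the principal
and keytab path are placeholders):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.UserGroupInformation

// Tell UGI that Kerberos is in force before attempting the keytab login.
val hadoopConf = new Configuration()
hadoopConf.set("hadoop.security.authentication", "kerberos")
UserGroupInformation.setConfiguration(hadoopConf)

// Placeholders for the real principal and keytab path.
UserGroupInformation.loginUserFromKeytab("user@EXAMPLE.COM", "/path/to/user.keytab")
```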

I have core-site.xml in my classpath. I changed hadoop.ssl.enabled to false,
as it was expecting a secret key. The principal I am using is correct. I
tried username/_HOST@fully.qualified.domain and
username@fully.qualified.domain with no success. I tried running Spark in
local mode and in YARN client mode. I am hoping someone has a recipe for, or
has solved, this problem. Any pointers to help set up or debug this will be
helpful.

I am getting the following error message:

Caused by: java.lang.IllegalArgumentException: Can't get Kerberos realm
    at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:65)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:227)
    at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:249)
    at org.apache.spark.examples.SparkYarn$.launchClient(SparkYarn.scala:55)
    at org.apache.spark.examples.SparkYarn$.main(SparkYarn.scala:83)
    at org.apache.spark.examples.SparkYarn.main(SparkYarn.scala)
    ... 6 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.security.authentication.util.KerberosUtil.getDefaultRealm(KerberosUtil.java:75)
    at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:63)
    ... 11 more
Caused by: KrbException: Cannot locate default realm
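
Since the trace bottoms out in "Cannot locate default realm", the krb5.conf
the JVM actually reads needs a default_realm entry. A hypothetical minimal
file (the realm name and KDC host are placeholders):

```
# Hypothetical minimal krb5.conf; realm and KDC host are placeholders.
[libdefaults]
  default_realm = EXAMPLE.COM

[realms]
  EXAMPLE.COM = {
    kdc = kdc.example.com
  }
```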



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Kerberos-setup-in-Apache-spark-connecting-to-remote-HDFS-Yarn-tp27181.html