[ https://issues.apache.org/jira/browse/SPARK-16067?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15339550#comment-15339550 ]

Partha Pratim Ghosh commented on SPARK-16067:
---------------------------------------------

I have a simpler question related to the same item, but one that does not 
involve JAAS. If a user (user1) is logged in to a Unix session, does a kinit 
there, and starts Spark from that session (Spark 1.5) with the property 
"spark.yarn.principal" set to user2 and "spark.yarn.keytab" set to user2's 
keytab, should the Spark context open? In my experience I see the following 
exception - 

Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): user1 tries to renew a token with renewer user2

Am I missing something?
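
For reference, a minimal sketch (assuming Hadoop's standard UserGroupInformation 
API; the principal name and keytab path are placeholders) of logging the whole 
process in from user2's keytab, so that the process credentials match 
"spark.yarn.principal" instead of coming from user1's kinit ticket cache - 

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class LoginFromKeytab {
    public static void main(String[] args) throws Exception {
        // Tell Hadoop to use Kerberos instead of simple authentication.
        Configuration hadoopConf = new Configuration();
        hadoopConf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(hadoopConf);
        // Log the JVM in as user2 from the keytab, bypassing the
        // /tmp/krb5cc_* cache created by user1's kinit (both placeholders).
        UserGroupInformation.loginUserFromKeytab("user2", "/app/user2.keytab");
        System.out.println("Login user: " + UserGroupInformation.getLoginUser());
    }
}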

> Spark overriding JAAS privilege using keytab 
> ---------------------------------------------
>
>                 Key: SPARK-16067
>                 URL: https://issues.apache.org/jira/browse/SPARK-16067
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Partha Pratim Ghosh
>
> I am using the JAAS doAsPrivileged method with a Kerberos (keytab) 
> authenticated subject to create a Spark context, but Spark is picking up 
> the Kerberos authentication from the system ticket cache instead. I want 
> it to use the authentication from JAAS.
> Following is my code (the JAAS login itself is wrapped in our JaasKerbCall helper) - 
> import java.io.File;
> import java.security.Principal;
> import java.security.PrivilegedAction;
> import java.util.ArrayList;
> import java.util.NoSuchElementException;
> import java.util.Set;
> import javax.security.auth.Subject;
> import org.apache.spark.SparkConf;
> import org.apache.spark.SparkContext;
> import com.google.common.base.Joiner;
>
> public void sparkJaas() {
>     final String principal = "user2";
>     final String keytab = "/app/user2.keytab";
>     /*final String principal = "user1";
>     final String keytab = "/app/user1.keytab";*/
>
>     // JAAS keytab login; returns the authenticated Subject.
>     final Subject subject = JaasKerbCall.getInstance().login(principal);
>     Subject.doAsPrivileged(subject, new PrivilegedAction<Object>() {
>         public Object run() {
>             String classServerUri = "http://<server host>:<server port>";
>             // Confirm which principals the JAAS subject carries.
>             Set<Principal> principals = subject.getPrincipals();
>             for (Principal p : principals) {
>                 System.out.println("amlpoc : Subject principal " + p.getName());
>             }
>             String sparkBasePath = "/app/spark-1.5.0-bin-hadoop2.6";
>             File pysparkPath = new File(sparkBasePath, "python" + File.separator + "lib");
>             File sparkPath = new File(sparkBasePath, "lib");
>             String[] sparkLibs = new String[] { "spark-assembly-1.5.0-hadoop2.6.0.jar" };
>             // Open Spark context
>             SparkConf conf = new SparkConf().setMaster("yarn-client").setAppName("spark-test")
>                     .set("spark.repl.class.uri", classServerUri);
>             conf.setSparkHome(sparkBasePath);
>             conf.set("spark.app.name", "spark-test");
>             conf.set("spark.executor.memory", "8g");
>             conf.set("spark.scheduler.mode", "FAIR");
>             conf.set("spark.yarn.principal", principal);
>             conf.set("spark.yarn.keytab", keytab);
>             // Only one of py4j-0.9-src.zip and py4j-0.8.2.1-src.zip should exist.
>             String[] pythonLibs = new String[] { "pyspark.zip", "py4j-0.9-src.zip", "py4j-0.8.2.1-src.zip" };
>             ArrayList<String> pythonLibUris = new ArrayList<String>();
>             for (String lib : pythonLibs) {
>                 File libFile = new File(pysparkPath, lib);
>                 if (libFile.exists()) {
>                     pythonLibUris.add(libFile.toURI().toString());
>                 }
>             }
>             for (String lib : sparkLibs) {
>                 File libFile = new File(sparkPath, lib);
>                 if (libFile.exists()) {
>                     pythonLibUris.add(libFile.toURI().toString());
>                 }
>             }
>             pythonLibUris.trimToSize();
>             // Distribute the two libraries (pyspark.zip and py4j-*.zip) to workers
>             // when the Spark version is less than or equal to 1.4.1.
>             if (pythonLibUris.size() == 2) {
>                 try {
>                     String confValue = conf.get("spark.yarn.dist.files");
>                     conf.set("spark.yarn.dist.files", confValue + "," + Joiner.on(",").join(pythonLibUris));
>                 } catch (NoSuchElementException e) {
>                     // spark.yarn.dist.files was not set yet.
>                     conf.set("spark.yarn.dist.files", Joiner.on(",").join(pythonLibUris));
>                 }
>                 conf.set("spark.files", conf.get("spark.yarn.dist.files"));
>                 conf.set("spark.submit.pyArchives", Joiner.on(":").join(pythonLibs));
>             }
>             conf.set("spark.yarn.isPython", "true");
>             SparkContext sparkContext = new SparkContext(conf);
>             System.out.println("SparkContext created : AppId : " + sparkContext.getConf().getAppId());
>             return sparkContext;
>         } // End run()
>     }, null);
> }
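> For completeness, a minimal sketch (untested) of an alternative that hands the 
> JAAS subject to Hadoop explicitly through UserGroupInformation.getUGIFromSubject, 
> assuming the standard Hadoop security API is on the classpath; run() would 
> contain the same SparkConf/SparkContext setup as above - 
>
> import java.security.PrivilegedExceptionAction;
> import javax.security.auth.Subject;
> import org.apache.hadoop.security.UserGroupInformation;
>
> // Wrap the JAAS subject in a UGI so Hadoop uses those credentials rather
> // than the ticket cache; doAs throws IOException/InterruptedException.
> final Subject subject = JaasKerbCall.getInstance().login("user2");
> UserGroupInformation ugi = UserGroupInformation.getUGIFromSubject(subject);
> ugi.doAs(new PrivilegedExceptionAction<Object>() {
>     public Object run() throws Exception {
>         // ... build the SparkConf and create the SparkContext as above ...
>         return null;
>     }
> });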
> Following is the Kerberos log - 
> [INFO] 
> [INFO] --- exec-maven-plugin:1.4.0:java (default-cli) @ spark-connectivity ---
> >>> KeyTabInputStream, readName(): XX.XX.XX.XX
> >>> KeyTabInputStream, readName(): user2
> >>> KeyTab: load() entry length: 81; type: 18
> >>> KeyTabInputStream, readName(): XX.XX.XX.XX
> >>> KeyTabInputStream, readName(): user2
> >>> KeyTab: load() entry length: 65; type: 17
> >>> KeyTabInputStream, readName(): XX.XX.XX.XX
> >>> KeyTabInputStream, readName(): user2
> >>> KeyTab: load() entry length: 65; type: 17
> >>> KeyTabInputStream, readName(): XX.XX.XX.XX
> >>> KeyTabInputStream, readName(): user2
> >>> KeyTab: load() entry length: 65; type: 17
> Looking for keys for: user2@<REALM>
> Java config name: /app/java/spark-connectivity/src/main/resources/krb5.conf
> Loaded from Java config
> Added key: 17version: 1
> Added key: 17version: 1
> Added key: 17version: 1
> Added key: 18version: 1
> >>> KdcAccessibility: reset
> Looking for keys for: user2@<REALM>
> Added key: 17version: 1
> Added key: 17version: 1
> Added key: 17version: 1
> Added key: 18version: 1
> default etypes for default_tkt_enctypes: 17 16 23.
> >>> KrbAsReq creating message
> >>> KrbKdcReq send: kdc=kdcs2-yy.yy.yy.yy UDP:88, timeout=30000, number of retries =3, #bytes=160
> >>> KDCCommunication: kdc=kdcs2-yy.yy.yy.yy UDP:88, timeout=30000,Attempt =1, #bytes=160
> >>> KrbKdcReq send: #bytes read=255
> >>>Pre-Authentication Data:
>          PA-DATA type = 2
>          PA-ENC-TIMESTAMP
> >>>Pre-Authentication Data:
>          PA-DATA type = 19
>          PA-ETYPE-INFO2 etype = 17, salt = <salt>, s2kparams = null
> >>>Pre-Authentication Data:
>          PA-DATA type = 13
> >>> KdcAccessibility: remove kdcs2-yy.yy.yy.yy:88
> >>> KDCRep: init() encoding tag is 126 req type is 11
> >>>KRBError:
>          cTime is Wed Mar 30 13:18:20 EDT 2022 1648660700000
>          sTime is Mon Jun 20 07:57:20 EDT 2016 1466423840000
>          suSec is 762837
>          error code is 25
>          error Message is Additional pre-authentication required
>          cname is user2@<REALM>
>          sname is <server>/<host>@<REALM>
>          eData provided.
>          msgType is 30
> >>>Pre-Authentication Data:
>          PA-DATA type = 2
>          PA-ENC-TIMESTAMP
> >>>Pre-Authentication Data:
>          PA-DATA type = 19
>          PA-ETYPE-INFO2 etype = 17, salt = <salt>, s2kparams = null
> >>>Pre-Authentication Data:
>          PA-DATA type = 13
> KRBError received: NEEDED_PREAUTH
> KrbAsReqBuilder: PREAUTH FAILED/REQ, re-send AS-REQ
> default etypes for default_tkt_enctypes: 17 16 23.
> Looking for keys for: user2@<REALM>
> Added key: 17version: 1
> Added key: 17version: 1
> Added key: 17version: 1
> Added key: 18version: 1
> Looking for keys for: user2@<REALM>
> Added key: 17version: 1
> Added key: 17version: 1
> Added key: 17version: 1
> Added key: 18version: 1
> default etypes for default_tkt_enctypes: 17 16 23.
> >>> EType: sun.security.krb5.internal.crypto.<CryptoType>
> >>> KrbAsReq creating message
> >>> KrbKdcReq send: kdc=kdcs2-yy.yy.yy.yy UDP:88, timeout=30000, number of retries =3, #bytes=247
> >>> KDCCommunication: kdc=kdcs2-yy.yy.yy.yy UDP:88, timeout=30000,Attempt =1, #bytes=247
> >>> KrbKdcReq send: #bytes read=632
> >>> KdcAccessibility: remove kdcs2-yy.yy.yy.yy:88
> Looking for keys for: user2@<REALM>
> Added key: 17version: 1
> Added key: 17version: 1
> Added key: 17version: 1
> Added key: 18version: 1
> >>> EType: sun.security.krb5.internal.crypto.<CryptoType>
> >>> KrbAsRep cons in KrbAsReq.getReply user2
> Authentication succeeded!
> <projName> : Subject principal user2@<REALM>
> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> 16/06/20 07:57:21 INFO SparkContext: Running Spark version 1.5.0
> >>>KinitOptions cache name is /tmp/krb5cc_515
> >>>DEBUG <CCacheInputStream>  client principal is [email protected]
> Why is Spark checking the ticket cache when JAAS is providing keytab 
> authentication?
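> A small diagnostic sketch, assuming the same UserGroupInformation API, that 
> can be dropped inside run() to show which credentials Hadoop actually picked 
> up (the JAAS subject or the /tmp/krb5cc_515 cache) - 
>
> // Requires org.apache.hadoop.security.UserGroupInformation and java.io.IOException.
> try {
>     System.out.println("current user: " + UserGroupInformation.getCurrentUser());
>     System.out.println("login user: " + UserGroupInformation.getLoginUser());
> } catch (IOException e) {
>     e.printStackTrace();
> }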


