Repository: spark
Updated Branches:
  refs/heads/branch-2.1 207107bca -> d5c4a5d06


[SPARK-18840][YARN] Avoid throwing an exception when getting the token renewal interval in a non-HDFS secure environment

## What changes were proposed in this pull request?

Fix a `java.util.NoSuchElementException` thrown when running Spark in a non-HDFS secure environment.

In the current code, we assume an `HDFS_DELEGATION_KIND` token will always be found in the `Credentials`. But in some cloud environments HDFS is not required, so no such token exists and the lookup throws; we should avoid this exception.
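
The sketch below isolates the pattern the fix applies (the `Token` class and kind string are illustrative stand-ins, not Spark's or Hadoop's actual types): `.head` on an empty filtered collection throws `java.util.NoSuchElementException`, while `.find` returns an `Option` and degrades gracefully when no HDFS token is present.

```scala
// Minimal, self-contained sketch; names are hypothetical, not Spark's API.
object TokenLookupSketch {
  // Hypothetical stand-in for a Hadoop delegation token.
  case class Token(kind: String)

  def main(args: Array[String]): Unit = {
    // No HDFS token, as in a cloud environment backed by object storage.
    val tokens = Seq.empty[Token]

    // Old pattern: filter(...).head throws NoSuchElementException here.
    // val t = tokens.filter(_.kind == "HDFS_DELEGATION_TOKEN").head

    // New pattern: find(...) returns None instead of throwing.
    val hdfsToken: Option[Token] = tokens.find(_.kind == "HDFS_DELEGATION_TOKEN")
    println(hdfsToken) // prints: None
  }
}
```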

## How was this patch tested?

Manually verified in a local environment.

Author: jerryshao <[email protected]>

Closes #16265 from jerryshao/SPARK-18840.

(cherry picked from commit 43298d157d58d5d03ffab818f8cdfc6eac783c55)
Signed-off-by: Marcelo Vanzin <[email protected]>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/d5c4a5d0
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/d5c4a5d0
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/d5c4a5d0

Branch: refs/heads/branch-2.1
Commit: d5c4a5d06b3282aec8300d27510393161773061b
Parents: 207107b
Author: jerryshao <[email protected]>
Authored: Tue Dec 13 10:37:45 2016 -0800
Committer: Marcelo Vanzin <[email protected]>
Committed: Tue Dec 13 10:37:56 2016 -0800

----------------------------------------------------------------------
 .../yarn/security/HDFSCredentialProvider.scala  | 21 ++++++++++----------
 1 file changed, 11 insertions(+), 10 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/d5c4a5d0/yarn/src/main/scala/org/apache/spark/deploy/yarn/security/HDFSCredentialProvider.scala
----------------------------------------------------------------------
diff --git a/yarn/src/main/scala/org/apache/spark/deploy/yarn/security/HDFSCredentialProvider.scala b/yarn/src/main/scala/org/apache/spark/deploy/yarn/security/HDFSCredentialProvider.scala
index 8d06d73..ebb176b 100644
--- a/yarn/src/main/scala/org/apache/spark/deploy/yarn/security/HDFSCredentialProvider.scala
+++ b/yarn/src/main/scala/org/apache/spark/deploy/yarn/security/HDFSCredentialProvider.scala
@@ -72,21 +72,22 @@ private[security] class HDFSCredentialProvider extends ServiceCredentialProvider
     // We cannot use the tokens generated with renewer yarn. Trying to renew
     // those will fail with an access control issue. So create new tokens with the logged in
     // user as renewer.
-    sparkConf.get(PRINCIPAL).map { renewer =>
+    sparkConf.get(PRINCIPAL).flatMap { renewer =>
       val creds = new Credentials()
       nnsToAccess(hadoopConf, sparkConf).foreach { dst =>
         val dstFs = dst.getFileSystem(hadoopConf)
         dstFs.addDelegationTokens(renewer, creds)
       }
-      val t = creds.getAllTokens.asScala
-        .filter(_.getKind == DelegationTokenIdentifier.HDFS_DELEGATION_KIND)
-        .head
-      val newExpiration = t.renew(hadoopConf)
-      val identifier = new DelegationTokenIdentifier()
-      identifier.readFields(new DataInputStream(new ByteArrayInputStream(t.getIdentifier)))
-      val interval = newExpiration - identifier.getIssueDate
-      logInfo(s"Renewal Interval is $interval")
-      interval
+      val hdfsToken = creds.getAllTokens.asScala
+        .find(_.getKind == DelegationTokenIdentifier.HDFS_DELEGATION_KIND)
+      hdfsToken.map { t =>
+        val newExpiration = t.renew(hadoopConf)
+        val identifier = new DelegationTokenIdentifier()
+        identifier.readFields(new DataInputStream(new ByteArrayInputStream(t.getIdentifier)))
+        val interval = newExpiration - identifier.getIssueDate
+        logInfo(s"Renewal Interval is $interval")
+        interval
+      }
     }
   }
 

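A side note on the `map` to `flatMap` change in the hunk above: the closure's body now yields an `Option[Long]` rather than a `Long`, so keeping `map` would produce a nested `Option[Option[Long]]`, while `flatMap` collapses the result to `Option[Long]`. A minimal sketch under assumed, illustrative values:

```scala
// Why the outer call changes from map to flatMap. The principal and the
// renewalInterval helper below are hypothetical stand-ins for the real code.
object FlatMapSketch {
  def main(args: Array[String]): Unit = {
    val principal: Option[String] = Some("user@EXAMPLE.COM")

    // Stand-in for the token-renewal lookup; None models "no HDFS token".
    def renewalInterval(renewer: String): Option[Long] = None

    val nested: Option[Option[Long]] = principal.map(renewalInterval)
    val flat: Option[Long] = principal.flatMap(renewalInterval)

    println(nested) // prints: Some(None)
    println(flat)   // prints: None
  }
}
```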
