GitHub user jerryshao opened a pull request:

    https://github.com/apache/spark/pull/16923

    [SPARK-19038][Hive][YARN] Correctly figure out keytab file name in yarn 
client mode

    Change-Id: I06170769f83fe530361a2737427b46d657f40d75
    
    ## What changes were proposed in this pull request?
    
    Because yarn#client resets the `spark.yarn.keytab` configuration to 
point to the keytab's location in the distributed cache, if the user reuses 
the old `SparkConf` to create a `SparkSession` with Hive enabled, Spark will 
read the keytab from the distributed-cache path. This is fine in yarn cluster 
mode, but in yarn client mode, where the driver runs outside the containers, 
fetching the keytab from that path fails.
    
    This is not a bug in Spark, but users can easily run into this issue if 
they are unaware of the subtlety. So instead of requiring the user to create 
a new `SparkConf` to recover the original keytab file path, this change puts 
that logic into Spark.
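    The failure mode can be sketched as follows. This is a minimal 
illustration, not the actual patch: `spark.yarn.keytab` and 
`spark.yarn.principal` are the real configuration keys, but the paths, 
principal, and the comment describing the rewritten value are assumptions 
for the example.
    
    ```scala
    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession
    
    // User sets a local keytab path before submitting (values hypothetical).
    val conf = new SparkConf()
      .set("spark.yarn.principal", "user@EXAMPLE.COM")
      .set("spark.yarn.keytab", "/etc/security/keytabs/user.keytab")
    
    // During submission, yarn#client rewrites spark.yarn.keytab to point at
    // the copy staged in the distributed cache.
    
    // Reusing the same SparkConf to build a Hive-enabled SparkSession then
    // reads the rewritten value. In cluster mode the driver runs inside a
    // container where the staged file is localized; in client mode the
    // driver runs outside the containers, so opening the keytab at that
    // path fails.
    val spark = SparkSession.builder()
      .config(conf)
      .enableHiveSupport()
      .getOrCreate()
    ```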
    
    ## How was this patch tested?
    
    Verified on a secure (Kerberized) cluster.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/jerryshao/apache-spark SPARK-19038

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/16923.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #16923
    
----
commit 7ed7c6c824c6b3ae5cf7a19edec37548adb1a2cc
Author: jerryshao <ss...@hortonworks.com>
Date:   2017-02-14T09:00:24Z

    Correctly figure out keytab file name in yarn client mode
    
    Change-Id: I06170769f83fe530361a2737427b46d657f40d75

----
