Github user harishreedharan commented on the pull request:

    https://github.com/apache/spark/pull/6051#issuecomment-100998671
  
    Have you actually run this on a cluster with the various options:
    * Use --principal + spark.yarn.keytab (and vice versa)
    * Both set, making sure the command line gets picked up
    * Only the command line, and only spark.yarn.*
    
    Also, I have not seen spark.internal.<config> options elsewhere in Spark.
    Are there other config options like this? I am wondering if it makes sense
    to just reuse the existing configs: if the command-line params are set,
    simply overwrite the values read from the configuration.
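    
    To make that concrete, here is a minimal sketch of the precedence I
    have in mind, assuming a hypothetical `Args` holder for the parsed
    --principal/--keytab flags (the real field names may differ):
    
        import org.apache.spark.SparkConf
    
        // Hypothetical container for the parsed command-line flags;
        // either field may be null when the flag was not given.
        case class Args(principal: String, keytab: String)
    
        // Command-line values win; otherwise fall back to the
        // spark.yarn.* entries already read from the configuration.
        def resolveKerberosCreds(args: Args, conf: SparkConf): (Option[String], Option[String]) = {
          val principal = Option(args.principal).orElse(conf.getOption("spark.yarn.principal"))
          val keytab    = Option(args.keytab).orElse(conf.getOption("spark.yarn.keytab"))
          (principal, keytab)
        }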


