Github user mgummelt commented on the issue:

    https://github.com/apache/spark/pull/14650
  
    I'm generally fine with this, though one downside is that it introduces an 
inconsistency with other daemon classes such as Master.scala, which only 
accept a properties file.  Maybe we should file a JIRA to add this to the 
other classes.
    
    Not that Spark configuration has any sort of sane consistency to be upheld. 
 There are ~10 ways of setting config properties.  It's one of the most 
confusing things about operations.
    
    @srowen Do you know why we even have a separate set of Spark config 
properties, rather than just using Java System properties?  I know you can 
load Spark conf from Java properties, but you can also load them via `--conf` 
(and only in spark-submit), which seems like an unnecessary, nonstandard 
interface.
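
    To make the overlap concrete, here is a minimal sketch (not Spark's actual 
implementation) of how three of those mechanisms layer on top of each other. 
The property names are real Spark keys, but the values and the merge itself 
are illustrative assumptions: defaults from a properties file such as 
spark-defaults.conf are overridden by `spark.*` Java system properties, which 
are in turn overridden by an explicit `--conf` flag to spark-submit.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative model of config precedence, NOT Spark's real code.
// Later puts overwrite earlier ones, mimicking "later source wins".
public class ConfPrecedence {
    public static Map<String, String> merged() {
        Map<String, String> conf = new HashMap<>();
        // 1. Defaults read from a properties file (e.g. spark-defaults.conf).
        conf.put("spark.executor.memory", "1g");
        conf.put("spark.app.name", "demo");
        // 2. Java system properties (e.g. -Dspark.executor.memory=2g)
        //    override the file defaults.
        conf.put("spark.executor.memory", "2g");
        // 3. An explicit flag (e.g. --conf spark.executor.memory=4g)
        //    overrides both of the above.
        conf.put("spark.executor.memory", "4g");
        return conf;
    }

    public static void main(String[] args) {
        // prints 4g: the --conf value shadows the other two sources
        System.out.println(merged().get("spark.executor.memory"));
        // prints demo: untouched keys fall through from the file defaults
        System.out.println(merged().get("spark.app.name"));
    }
}
```

    The point of the sketch is just that each extra mechanism adds another 
layer to reason about, which is the operational confusion described above.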

