Github user ksakellis commented on the pull request:

    https://github.com/apache/spark/pull/4111#issuecomment-70737041
  
    @pwendell I think using configuration to add Spark listeners adds 
complexity to the system:
    1) Listeners can't be statically checked at compile time.
    2) The config option is not self-describing; users will need to consult 
the docs to learn how to add these listeners.
    3) A listener may not need a SparkConf, yet this mechanism forces you to 
change your listener implementation to accept one.
    
    @vanzin had the idea, which I like, of creating a SparkContextBuilder 
object that would make construction easier than adding more constructor 
arguments. So you can imagine something like:
    ```scala
    val context = SparkContextBuilder.addConfig(config)
                                     .addListeners(...)
                                     .setSparkHome(...)
                                     .build()
    ```
    We can obviously keep supporting the existing constructors for 
compatibility but move towards this for new constructor arguments. Just an idea.
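
    To make the idea concrete, here is a minimal, self-contained sketch of 
such a builder. All names here (`Conf`, `Listener`, `Context`, 
`ContextBuilder`) are illustrative stand-ins, not Spark's actual API; the 
point is that listeners accumulate on the builder as typed values (so the 
compiler checks them, addressing point 1) and `build()` assembles the context 
in one place:
    ```scala
    // Stand-ins for SparkConf and SparkListener, for illustration only.
    final case class Conf(settings: Map[String, String])
    trait Listener { def name: String }

    // The finished, immutable context produced by build().
    final class Context(
        val conf: Conf,
        val listeners: Seq[Listener],
        val sparkHome: Option[String])

    // Each setter returns a new builder, so calls chain fluently.
    class ContextBuilder private (
        conf: Conf,
        listeners: Vector[Listener],
        sparkHome: Option[String]) {

      def addConfig(c: Conf): ContextBuilder =
        new ContextBuilder(c, listeners, sparkHome)

      def addListeners(ls: Listener*): ContextBuilder =
        new ContextBuilder(conf, listeners ++ ls, sparkHome)

      def setSparkHome(home: String): ContextBuilder =
        new ContextBuilder(conf, listeners, Some(home))

      def build(): Context = new Context(conf, listeners, sparkHome)
    }

    object ContextBuilder {
      def apply(): ContextBuilder =
        new ContextBuilder(Conf(Map.empty), Vector.empty, None)
    }

    // Usage mirrors the snippet above:
    val ctx = ContextBuilder()
      .addConfig(Conf(Map("spark.app.name" -> "demo")))
      .addListeners(new Listener { def name = "metrics" })
      .setSparkHome("/opt/spark")
      .build()
    ```
    New settings then become new builder methods rather than new constructor 
overloads, which is what keeps the existing constructors untouched.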

