GitHub user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/4111#issuecomment-70732807
  
    Hey, so for this one I think it would be better to use a configuration 
mechanism rather than exposing a different constructor. The problem with the 
constructor approach is that users have to rewrite their code to use a 
different constructor in order to get this feature, whereas many applications 
are written against the standard SparkContext constructor.
    
    What about having a configuration option that lets you specify the classes 
of additional listeners, and then expecting each listener to have a 
constructor that accepts a SparkConf?
    
    ```
    spark.extraListeners=a.b.C,d.e.F
    ```
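
    To make the constructor expectation concrete, here is a minimal sketch of 
what such a listener might look like (the class name `a.b.C` is just the 
placeholder from the example above, and the body is hypothetical):

    ```
    package a.b

    import org.apache.spark.SparkConf
    import org.apache.spark.scheduler.{SparkListener, SparkListenerApplicationEnd}

    // A listener the mechanism could instantiate reflectively; the only
    // requirement is the single-argument SparkConf constructor.
    class C(conf: SparkConf) extends SparkListener {
      override def onApplicationEnd(end: SparkListenerApplicationEnd): Unit = {
        // The conf it was constructed with is available for its own settings.
        println(s"App ${conf.get("spark.app.name", "<unknown>")} ended at ${end.time}")
      }
    }
    ```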
    
    Then, when we construct a SparkContext, we instantiate these listeners 
with the SparkConf and attach them to the context.
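
    For illustration, the instantiation step could be a small reflection 
helper along these lines (a sketch under the assumptions above, not an actual 
implementation; the helper name is made up):

    ```
    import org.apache.spark.SparkConf
    import org.apache.spark.scheduler.SparkListener

    // Instantiate every class named in spark.extraListeners, assuming each
    // class exposes a constructor that takes a SparkConf.
    def createExtraListeners(conf: SparkConf): Seq[SparkListener] = {
      conf.get("spark.extraListeners", "")
        .split(",")
        .map(_.trim)
        .filter(_.nonEmpty)
        .map { className =>
          val ctor = Class.forName(className).getConstructor(classOf[SparkConf])
          ctor.newInstance(conf).asInstanceOf[SparkListener]
        }
        .toSeq
    }
    ```

    Each resulting listener could then be registered on the context, e.g. via 
SparkContext#addSparkListener.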


