[
https://issues.apache.org/jira/browse/SPARK-7170?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean Owen updated SPARK-7170:
-----------------------------
Priority: Minor (was: Major)
Affects Version/s: (was: 1.4.0)
> Allow registering SparkListeners specified in SparkConf
> -------------------------------------------------------
>
> Key: SPARK-7170
> URL: https://issues.apache.org/jira/browse/SPARK-7170
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Affects Versions: 1.3.1
> Reporter: Jacek Lewandowski
> Priority: Minor
>
> Currently, if the user wants to add a SparkListener to a SparkContext, the
> listener has to be explicitly instantiated and registered by calling
> {{SparkContext.addSparkListener}}.
> h5. Problem
> This is not so much a bug as an inconvenience in some cases. Suppose we want
> to provide a default listener, or several listeners, common to all the
> applications in some environment, or to add a listener to existing
> applications that are already compiled or whose code we do not want to change.
> h5. Proposed solution
> We could just specify {{spark.driver.sparkListeners=a.b.c.SomeListener,...}}
> and the referenced class(es) would be instantiated and registered as listeners
> automatically when the {{SparkContext}} is created. The proposed change does
> not alter the API; it merely adds a handler for one more property in the
> Spark configuration.
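A minimal sketch of the mechanism this proposal implies: split the comma-separated property value and instantiate each named class reflectively through its no-arg constructor, then hand each instance to {{SparkContext.addSparkListener}}. The property name comes from the proposal above; the helper name is made up for illustration, and JDK classes stand in for real SparkListener implementations so the snippet runs without Spark on the classpath.

```scala
// Hypothetical handler for a "spark.driver.sparkListeners" style property:
// parse a comma-separated list of class names and instantiate each one via
// its public zero-argument constructor. In Spark itself, each resulting
// object would then be passed to SparkContext.addSparkListener.
def instantiateFromConf(confValue: String): Seq[AnyRef] =
  confValue.split(",").toSeq
    .map(_.trim)
    .filter(_.nonEmpty)
    .map { className =>
      // Each listed class must be on the driver's classpath and expose
      // a public no-arg constructor, or this throws at startup.
      Class.forName(className).getDeclaredConstructor().newInstance()
        .asInstanceOf[AnyRef]
    }

// JDK classes used as stand-ins for listener implementations:
val listeners = instantiateFromConf("java.util.ArrayList, java.lang.StringBuilder")
println(listeners.map(_.getClass.getName).mkString(","))
```

Failing fast with a clear error when a class is missing or lacks a no-arg constructor is preferable here, since a silently dropped listener would be hard to diagnose.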
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)