[ https://issues.apache.org/jira/browse/SPARK-7786?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14594240#comment-14594240 ]

Tathagata Das commented on SPARK-7786:
--------------------------------------

[~397090770] This functionality can easily be achieved in user code, without 
actually losing any events, rather than being provided as a SparkConf setting. 
The user can pass the class name by whatever means (cmdline args, etc.) into 
the process, use reflection to instantiate the right listener, and attach it 
to the streaming context before starting it. 
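
Roughly something like this (a sketch only, assuming the listener class name is 
passed as the first command-line argument, has a no-arg constructor, and 
"com.example.MyStreamingListener" is purely illustrative):

{code:scala}
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.scheduler.StreamingListener

object ListenerFromArgs {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("listener-demo")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Listener class name supplied by the user, e.g. args(0) =
    // "com.example.MyStreamingListener" (hypothetical); it needs a no-arg constructor.
    val listener = Class.forName(args(0))
      .newInstance()
      .asInstanceOf[StreamingListener]

    // Attach before start(), so no streaming events are missed.
    ssc.addStreamingListener(listener)

    // ... define the DStreams and output operations here ...

    ssc.start()
    ssc.awaitTermination()
  }
}
{code}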

The reason similar functionality was added for SparkListener is that attaching 
a listener after the SparkContext has been initialized will miss the initial 
events. So the system needs to attach the listener before any event has been 
generated, and that is why the SparkConf config was necessary. However, this is 
not the case for StreamingListener, as there are no events before the 
StreamingContext is started.
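
For reference, the SparkListener case (SPARK-5411) is a config-only hook, along 
these lines (the listener class name here is just an example):

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}

// SparkListener case (SPARK-5411): the listener class is named in SparkConf,
// so it is registered before any events are generated.
val conf = new SparkConf()
  .setAppName("listener-demo")
  // "com.example.MySparkListener" is illustrative; it needs a zero-arg
  // constructor (or one taking a SparkConf).
  .set("spark.extraListeners", "com.example.MySparkListener")
val sc = new SparkContext(conf)
{code}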

So can you elaborate on scenarios where this is absolutely essential?

> Allow StreamingListener to be specified in SparkConf and loaded when creating 
> StreamingContext
> ----------------------------------------------------------------------------------------------
>
>                 Key: SPARK-7786
>                 URL: https://issues.apache.org/jira/browse/SPARK-7786
>             Project: Spark
>          Issue Type: New Feature
>          Components: Streaming
>    Affects Versions: 1.3.1
>            Reporter: yangping wu
>            Priority: Minor
>
> As mentioned in 
> [SPARK-5411|https://issues.apache.org/jira/browse/SPARK-5411], we could also 
> allow users to register a StreamingListener through SparkConf settings and 
> have it loaded when the StreamingContext is created. This would allow 
> monitoring frameworks to be easily injected into Spark programs without 
> having to modify those programs' code.


