[ https://issues.apache.org/jira/browse/SPARK-7951?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-7951.
------------------------------
    Resolution: Invalid

I don't think this clearly describes a problem; it sounds like a question for 
user@ first. Please see 
https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark

> Events Missing with custom sparkListener
> ----------------------------------------
>
>                 Key: SPARK-7951
>                 URL: https://issues.apache.org/jira/browse/SPARK-7951
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.3.0
>            Reporter: Anubhav Srivastava
>              Labels: patch
>   Original Estimate: 48h
>  Remaining Estimate: 48h
>
> If we add our own SparkListener, we miss some of the events. I went through 
> the spark-core code and found that initial events such as onApplicationStart() 
> are fired while SparkContext is being initialised. The default event listeners 
> receive those events because they are registered while the SparkContext object 
> is being created, but a listener added later via sc().addListener() is not yet 
> registered when those initial events are fired, so it never receives them.
> In some cases you may still receive those events: asyncListenerBus spawns a 
> new thread when it is created, so if that thread is slow to start processing, 
> our listener might get added in time and see the events, but that is not 
> reliable.
> I can fix it; in fact the fix is already working fine for me. I just wanted to 
> know whether this behaviour is intentional, and if so, what the reason for it is.
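
For reference, a minimal sketch of the scenario described above, assuming Scala 
and the Spark 1.x listener API (the class name MyListener and the println bodies 
are purely illustrative; the registration call sketched here is 
SparkContext.addSparkListener, which appears to be what the report calls 
addListener):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.scheduler.{SparkListener, SparkListenerApplicationStart, SparkListenerJobStart}

    // A trivial custom listener: override only the callbacks of interest.
    class MyListener extends SparkListener {
      override def onApplicationStart(event: SparkListenerApplicationStart): Unit =
        println(s"application started: ${event.appName}")
      override def onJobStart(event: SparkListenerJobStart): Unit =
        println(s"job started: ${event.jobId}")
    }

    object ListenerTimingDemo {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("listener-timing").setMaster("local[2]")
        // Early events such as onApplicationStart are posted while this
        // constructor runs, before user code can register anything.
        val sc = new SparkContext(conf)

        // Registered after the fact: this listener will see job/stage events
        // from the count() below, but normally not the application-start event.
        sc.addSparkListener(new MyListener)

        sc.parallelize(1 to 100).count()
        sc.stop()
      }
    }

If the goal is only to observe those early events, the spark.extraListeners 
configuration property (present in recent 1.x releases, if memory serves) may 
help: the context instantiates and registers such listeners itself during 
construction, before any events are posted, given the listener's fully 
qualified class name and a no-arg constructor.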



