GitHub user Ngone51 opened a pull request:

    https://github.com/apache/spark/pull/20053

    Init lastReportTimestamp with system current time when start() called in AsyncEventQueue
    
    ## What changes were proposed in this pull request?
    ```
    if (droppedEventsCounter.compareAndSet(droppedCount, 0)) {
      val prevLastReportTimestamp = lastReportTimestamp
      lastReportTimestamp = System.currentTimeMillis()
      val previous = new java.util.Date(prevLastReportTimestamp)
      logWarning(s"Dropped $droppedEvents events from $name since $previous.")
    }
    ```
    The first time we log the previous date, it is "Thu Jan 01 08:00:00 CST 1970" (the Unix epoch rendered in the local timezone), because lastReportTimestamp is initialized to 0L. Although this is not wrong in theory, using AsyncEventQueue's start time reads better:
    ```
    private[scheduler] def start(sc: SparkContext): Unit = {
        if (started.compareAndSet(false, true)) {
          this.sc = sc
          lastReportTimestamp = System.currentTimeMillis()
          dispatchThread.start()
        } else {
          throw new IllegalStateException(s"$name already started!")
        }
      }
    
    ```
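    The epoch-date behavior above can be illustrated with a minimal sketch (the object and method names here are hypothetical, not part of Spark; the message format loosely follows the logWarning above):

    ```scala
    import java.util.Date

    object EpochDateDemo {
      // With lastReportTimestamp left at its default of 0L, the first warning
      // formats the Unix epoch in the JVM's local timezone.
      def firstWarning(lastReportTimestamp: Long): String = {
        val previous = new Date(lastReportTimestamp)
        s"Dropped 100 events from eventLog since $previous."
      }

      def main(args: Array[String]): Unit = {
        // Default 0L: prints the epoch date, e.g. "... since Thu Jan 01 08:00:00 CST 1970."
        // when the local timezone is UTC+8.
        println(firstWarning(0L))
        // With the proposed fix, start() seeds the timestamp with the current time,
        // so the first warning reports the queue's actual start time instead.
        println(firstWarning(System.currentTimeMillis()))
      }
    }
    ```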
    ## How was this patch tested?
    
    Manual test:
    Debugged the unit test `test("metrics for dropped listener events")` in SparkListenerSuite to check the previous date logged the first time events are dropped.
    


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/Ngone51/spark SPARK-22873

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/20053.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #20053
    
----
commit 64facd95611ec1a5f3a51b456ff8e48ecefa9a93
Author: wuyi <ngone_5451@...>
Date:   2017-12-22T06:25:36Z

    Init lastReportTimestamp with system current time when start() called in 
AsyncEventQueue

----


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
