Repository: spark
Updated Branches:
  refs/heads/master 310632498 -> 49a1993b1


[SPARK-25163][SQL] Fix flaky test: o.a.s.util.collection.ExternalAppendOnlyMapSuiteCheck

## What changes were proposed in this pull request?

`ExternalAppendOnlyMapSuiteCheck` test is flaky.

We use a `SparkListener` to collect spill metrics of completed stages, and `withListener` runs the code that performs the spilling. The spill status was checked after that code finished, but the check still ran inside `withListener`, at which point not all events posted to the listener bus had necessarily been processed.

We should check the spill status only after all events have been processed.
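
For context, a minimal sketch of the `withListener` helper is shown below. The body is an assumption about how such a helper typically works, not code quoted from this commit; the key point is the drain step (`waitUntilEmpty`) in the `finally` block, which is why an assertion made after `withListener` returns sees every stage-completed event.

```scala
import java.util.concurrent.TimeUnit

import org.apache.spark.SparkContext
import org.apache.spark.scheduler.SparkListener

// Sketch only (assumed shape, placed inside the org.apache.spark package so
// the private[spark] listenerBus is visible): register the listener, run the
// body, then drain the listener bus before unregistering. Any assertion made
// after this helper returns therefore observes all stage-completed events.
def withListener[L <: SparkListener](sc: SparkContext, listener: L)(body: L => Unit): Unit = {
  sc.addSparkListener(listener)
  try {
    body(listener)
  } finally {
    // Assumed drain step: block until queued listener events have been delivered.
    sc.listenerBus.waitUntilEmpty(TimeUnit.SECONDS.toMillis(10))
    sc.removeSparkListener(listener)
  }
}
```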

## How was this patch tested?

Locally ran unit tests.

Closes #22181 from viirya/SPARK-25163.

Authored-by: Liang-Chi Hsieh <vii...@gmail.com>
Signed-off-by: Shixiong Zhu <zsxw...@gmail.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/49a1993b
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/49a1993b
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/49a1993b

Branch: refs/heads/master
Commit: 49a1993b168accb6f188c682546f12ea568173c4
Parents: 3106324
Author: Liang-Chi Hsieh <vii...@gmail.com>
Authored: Wed Aug 22 14:17:05 2018 -0700
Committer: Shixiong Zhu <zsxw...@gmail.com>
Committed: Wed Aug 22 14:17:05 2018 -0700

----------------------------------------------------------------------
 core/src/main/scala/org/apache/spark/TestUtils.scala | 10 ++++++----
 1 file changed, 6 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/49a1993b/core/src/main/scala/org/apache/spark/TestUtils.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/TestUtils.scala b/core/src/main/scala/org/apache/spark/TestUtils.scala
index 6cc8fe1..c2ebd38 100644
--- a/core/src/main/scala/org/apache/spark/TestUtils.scala
+++ b/core/src/main/scala/org/apache/spark/TestUtils.scala
@@ -173,10 +173,11 @@ private[spark] object TestUtils {
    * Run some code involving jobs submitted to the given context and assert that the jobs spilled.
    */
   def assertSpilled(sc: SparkContext, identifier: String)(body: => Unit): Unit = {
-    withListener(sc, new SpillListener) { listener =>
+    val listener = new SpillListener
+    withListener(sc, listener) { _ =>
       body
-      assert(listener.numSpilledStages > 0, s"expected $identifier to spill, but did not")
     }
+    assert(listener.numSpilledStages > 0, s"expected $identifier to spill, but did not")
   }

   /**
@@ -184,10 +185,11 @@ private[spark] object TestUtils {
    * did not spill.
    */
   def assertNotSpilled(sc: SparkContext, identifier: String)(body: => Unit): Unit = {
-    withListener(sc, new SpillListener) { listener =>
+    val listener = new SpillListener
+    withListener(sc, listener) { _ =>
       body
-      assert(listener.numSpilledStages == 0, s"expected $identifier to not spill, but did")
     }
+    assert(listener.numSpilledStages == 0, s"expected $identifier to not spill, but did")
   }
 
   /**

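As an illustration of how these helpers are exercised, a hypothetical test (within Spark's own test sources, since `TestUtils` is `private[spark]`) could look like the following. The job, master, and the force-spill threshold setting are assumptions made for the example, not taken from this commit:

```scala
import org.apache.spark.{SparkConf, SparkContext, TestUtils}

// Hypothetical usage: force frequent spills with a tiny element threshold,
// run a shuffle, and assert that it spilled. The assertion now runs after
// withListener has drained the listener bus, so it cannot race with
// stage-completed events.
val conf = new SparkConf()
  .setMaster("local[2]")
  .setAppName("spill-assertion-example")
  .set("spark.shuffle.spill.numElementsForceSpillThreshold", "10")
val sc = new SparkContext(conf)
try {
  TestUtils.assertSpilled(sc, "reduceByKey") {
    sc.parallelize(1 to 1000, 2).map(i => (i % 100, i)).reduceByKey(_ + _).count()
  }
} finally {
  sc.stop()
}
```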
