GitHub user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21369#discussion_r190583904
  
    --- Diff: core/src/test/scala/org/apache/spark/util/collection/ExternalAppendOnlyMapSuite.scala ---
    @@ -414,7 +415,106 @@ class ExternalAppendOnlyMapSuite extends SparkFunSuite with LocalSparkContext {
         sc.stop()
       }
     
    -  test("external aggregation updates peak execution memory") {
    +  test("SPARK-22713 spill during iteration leaks internal map") {
    +    val size = 1000
    +    val conf = createSparkConf(loadDefaults = true)
    +    sc = new SparkContext("local-cluster[1,1,1024]", "test", conf)
    +    val map = createExternalMap[Int]
    +
    +    map.insertAll((0 until size).iterator.map(i => (i / 10, i)))
    +    assert(map.numSpills == 0, "map was not supposed to spill")
    +
    +    val it = map.iterator
    +    assert(it.isInstanceOf[CompletionIterator[_, _]])
    +    val underlyingIt = map.readingIterator
    +    assert( underlyingIt != null )
    --- End diff --
    
    `assert(underlyingIt != null)`: we should not put spaces inside the parentheses. Can you fix all of them?
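    
    For reference, a minimal sketch of the requested fix, with the spaces inside the parentheses removed (the names come from the quoted diff; this is illustrative, not the final patch):
    
        // Scala style: no padding spaces inside parentheses
        val underlyingIt = map.readingIterator
        assert(underlyingIt != null)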


---
