Github user eyalfa commented on a diff in the pull request:
https://github.com/apache/spark/pull/21369#discussion_r192631230
--- Diff: core/src/test/scala/org/apache/spark/util/collection/ExternalAppendOnlyMapSuite.scala ---
@@ -414,7 +415,106 @@ class ExternalAppendOnlyMapSuite extends SparkFunSuite with LocalSparkContext {
     sc.stop()
   }
-  test("external aggregation updates peak execution memory") {
+  test("SPARK-22713 spill during iteration leaks internal map") {
+    val size = 1000
+    val conf = createSparkConf(loadDefaults = true)
+    sc = new SparkContext("local-cluster[1,1,1024]", "test", conf)
+    val map = createExternalMap[Int]
--- End diff ---
@cloud-fan, can we move on with this?
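
For anyone skimming the thread: the diff above only shows the setup of the proposed regression test. As a rough, self-contained sketch of the property it guards (not the PR's actual code), the snippet below models a spillable map whose in-memory buffer must be released once a spill happens while an iterator is still open. All names here (SimpleSpillableMap, forceSpill, isInMemoryBufferReleased) are illustrative assumptions, not Spark APIs.

    import scala.collection.mutable.ArrayBuffer

    // Simplified stand-in for a spillable append-only map; not Spark's
    // ExternalAppendOnlyMap.
    final class SimpleSpillableMap[K, V] {
      // In-memory buffer; set to null once its contents have been "spilled".
      private var inMemory: ArrayBuffer[(K, V)] = ArrayBuffer.empty
      private var spilled: Vector[(K, V)] = Vector.empty

      def insert(k: K, v: V): Unit = inMemory += ((k, v))

      // Simulated spill under memory pressure: copy the records out in order
      // and drop the buffer reference so it can be garbage collected.
      def forceSpill(): Unit = {
        if (inMemory != null) {
          spilled = spilled ++ inMemory
          inMemory = null
        }
      }

      def isInMemoryBufferReleased: Boolean = inMemory == null

      // Iterator that re-checks on every access whether the data has been
      // spilled, so it never pins the in-memory buffer after a spill.
      def iterator: Iterator[(K, V)] = new Iterator[(K, V)] {
        private var pos = 0
        private def total: Int =
          if (inMemory != null) inMemory.length else spilled.length
        def hasNext: Boolean = pos < total
        def next(): (K, V) = {
          val rec = if (inMemory != null) inMemory(pos) else spilled(pos)
          pos += 1
          rec
        }
      }
    }

    object SpillDuringIterationSketch {
      def main(args: Array[String]): Unit = {
        val size = 1000
        val map = new SimpleSpillableMap[Int, Int]
        (0 until size).foreach(i => map.insert(i / 10, i))

        val it = map.iterator
        val firstHalf = (0 until size / 2).map(_ => it.next())  // iterate part way
        map.forceSpill()                                         // spill mid-iteration

        // The property SPARK-22713 is about: the in-memory buffer must be
        // released after the spill even though the iterator is still open.
        assert(map.isInMemoryBufferReleased, "in-memory buffer leaked after spill")

        val secondHalf = it.toVector                             // finish iterating
        assert(firstHalf.size + secondHalf.size == size, "records lost by the spill")
        println("spill released the buffer and the iterator still saw every record")
      }
    }

The real test drives Spark's ExternalAppendOnlyMap through the suite's createExternalMap helper and a local-cluster SparkContext, as shown in the diff; the sketch only mirrors the leak check itself.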
---