GitHub user eyalfa commented on a diff in the pull request:
https://github.com/apache/spark/pull/19181#discussion_r138751839
--- Diff:
core/src/test/java/org/apache/spark/util/collection/unsafe/sort/UnsafeExternalSorterSuite.java
---
@@ -503,6 +511,47 @@ public void testGetIterator() throws Exception {
verifyIntIterator(sorter.getIterator(279), 279, 300);
}
+ @Rule
+ public ExpectedException exceptions = ExpectedException.none();
+
+ @Test
+ public void testOOMDuringSpill() throws Exception {
+ final UnsafeExternalSorter sorter = newSorter();
+ UnsafeInMemorySorter unsafeInMemSorter = sorter.inMemSorter;
+ for (int i = 0; unsafeInMemSorter.hasSpaceForAnotherRecord(); ++i) {
+ insertNumber(sorter, i);
+ }
+ // todo: this might actually not be zero if pageSize is somehow configured differently,
+ // so we actually have to compute the expected spill size according to the configured page size
+ assertEquals(0, sorter.getSpillSize());
+ // we expect the next insert to attempt growing the pointers array;
+ // the first allocation is expected to fail, then a spill is triggered, which attempts another allocation
+ // that also fails, and we expect to see this OOM here.
+ // the original code messed with a released array within the spill code
+ // and ended up with a failed assertion.
+ // we also expect the location of the OOM to be
+ // org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorter.reset
+ memoryManager.markConseqOOM(2);
+ exceptions.expect(OutOfMemoryError.class);
--- End diff --
I'm used to scalatest's Matchers; things would be so much easier if this
test were written in Scala...
Will rewrite...
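For illustration, the `ExpectedException` rule used in the diff can be replaced by an assertThrows-style helper, which is essentially what scalatest's `intercept` gives you in Scala. This is a minimal self-contained sketch, not the PR's actual code: the `expectThrows` helper and the simulated-OOM body are hypothetical stand-ins for `insertNumber(sorter, 1024)` running after `memoryManager.markConseqOOM(2)`.

```java
// Minimal stand-in for scalatest's intercept / JUnit's assertThrows:
// runs the body and returns the thrown exception if it matches the
// expected type; otherwise the assertion fails.
public class InterceptSketch {
    static <T extends Throwable> T expectThrows(Class<T> expected, Runnable body) {
        try {
            body.run();
        } catch (Throwable t) {
            if (expected.isInstance(t)) {
                return expected.cast(t);  // caller can inspect message, stack trace, etc.
            }
            throw new AssertionError("Unexpected exception type: " + t, t);
        }
        throw new AssertionError(
            "Expected " + expected.getName() + " but nothing was thrown");
    }

    public static void main(String[] args) {
        // Hypothetical body: in the real test this would be the insert that
        // triggers the failing allocation and the spill.
        OutOfMemoryError e = expectThrows(OutOfMemoryError.class,
            () -> { throw new OutOfMemoryError("simulated OOM during spill"); });
        System.out.println("caught: " + e.getMessage());
    }
}
```

Unlike the `@Rule`, the helper returns the caught exception, so the test can also assert on its message or stack trace (e.g. that the OOM originated in `UnsafeInMemorySorter.reset`).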
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]