srowen commented on a change in pull request #23457: [SPARK-26539][CORE] Remove spark.memory.useLegacyMode and StaticMemoryManager
URL: https://github.com/apache/spark/pull/23457#discussion_r245480446
##########
File path: core/src/test/scala/org/apache/spark/storage/MemoryStoreSuite.scala
##########
@@ -291,11 +290,11 @@ class MemoryStoreSuite
blockInfoManager.removeBlock("b3")
putIteratorAsBytes("b3", smallIterator, ClassTag.Any)
- // Unroll huge block with not enough space. This should fail and kick out b2 in the process.
+ // Unroll huge block with not enough space.
val result4 = putIteratorAsBytes("b4", bigIterator, ClassTag.Any)
assert(result4.isLeft) // unroll was unsuccessful
assert(!memoryStore.contains("b1"))
- assert(!memoryStore.contains("b2"))
+ assert(memoryStore.contains("b2")) // not necessarily evicted
Review comment:
Oops, yes: b2 takes 4000 bytes and b4 takes 40000, so certainly b4 can't fit. Thinking this through a second time, this isn't quite the right change. The difference between total and storage memory in the unified memory manager is irrelevant here. It's just that this test needs less total memory available -- a little more than 8000 bytes, not 12000. Let me push a fix.
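A rough sketch of the arithmetic behind that reasoning (the 4000/40000 block sizes and the 8000/12000 totals come from the comment above; the small blocks b1..b3 are assumed to be ~4000 bytes each, and this is illustrative arithmetic, not the suite's actual MemoryStore fixture):

```scala
// Sizes as discussed in the review comment above (assumed, illustrative).
val smallBlockSize = 4000L    // b1, b2, b3 each take ~4000 bytes
val b4Size = 40000L           // b4 needs ~40000 bytes to unroll
val currentTotal = 12000L     // memory the test currently grants
val proposedTotal = 8001L     // "a little more than 8000 bytes"

// b4 can never fit, so its unroll fails under either budget:
assert(b4Size > currentTotal && b4Size > proposedTotal)

// With 12000 bytes, b1..b3 all fit simultaneously, so nothing forces
// b2 out; the weakened assertion only papers over that.
assert(3 * smallBlockSize <= currentTotal)

// With just over 8000 bytes, the three small blocks cannot coexist,
// so eviction is actually forced, which is what the test means to check.
assert(3 * smallBlockSize > proposedTotal)
```

This is why shrinking the total memory, rather than relaxing the `memoryStore.contains("b2")` assertion, restores the original intent of the test.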
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]