Ngone51 commented on a change in pull request #23457: [SPARK-26539][CORE]
Remove spark.memory.useLegacyMode and StaticMemoryManager
URL: https://github.com/apache/spark/pull/23457#discussion_r245470208
##########
File path: core/src/test/scala/org/apache/spark/storage/MemoryStoreSuite.scala
##########
@@ -291,11 +290,11 @@ class MemoryStoreSuite
blockInfoManager.removeBlock("b3")
putIteratorAsBytes("b3", smallIterator, ClassTag.Any)
- // Unroll huge block with not enough space. This should fail and kick out b2 in the process.
+ // Unroll huge block with not enough space.
val result4 = putIteratorAsBytes("b4", bigIterator, ClassTag.Any)
assert(result4.isLeft) // unroll was unsuccessful
assert(!memoryStore.contains("b1"))
- assert(!memoryStore.contains("b2"))
+ assert(memoryStore.contains("b2")) // not necessarily evicted
Review comment:
Doesn't b2 take 4000 bytes and b4 take 40000?
I verified the test and I think the assertion is correct.
Actually, with the current `UnifiedMemoryManager` behaviour, there may be
12000 bytes available. When we try to put b4 into memory, it requires 24064
bytes of memory once it has unrolled 16 elements (which hits the
`memoryCheckPeriod` for the first time). Since 24064 bytes exceeds the
maxMemory of 12000, it just returns instead of evicting any blocks.
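To make the reasoning concrete, here is a hedged sketch of the unroll bookkeeping described above. The numbers (maxMemory = 12000, b2 already using 4000, memoryCheckPeriod = 16, an estimated size of 24064 bytes at the first check) come from this comment; the object and method names are illustrative, not Spark's actual `MemoryStore` API.

```scala
object UnrollSketch {
  val maxMemory = 12000L     // bytes available to storage in this test (assumed)
  val memoryCheckPeriod = 16 // elements unrolled between size estimates
  var used = 4000L           // b2 is already cached

  /** Try to unroll a block; Left(estimate) means the unroll failed,
   *  mirroring result4.isLeft in the test. */
  def tryUnroll(sizeAtFirstCheck: Long): Either[Long, Long] = {
    if (sizeAtFirstCheck > maxMemory) {
      // Even evicting every cached block frees at most maxMemory bytes,
      // so the request can never be satisfied: give up without evicting.
      Left(sizeAtFirstCheck)
    } else if (sizeAtFirstCheck > maxMemory - used) {
      // In this branch eviction would help, and b2 could be kicked out.
      used = sizeAtFirstCheck
      Right(sizeAtFirstCheck)
    } else {
      used += sizeAtFirstCheck
      Right(sizeAtFirstCheck)
    }
  }

  def main(args: Array[String]): Unit = {
    // After the first memoryCheckPeriod (16) elements, b4's estimated
    // size is already 24064 bytes, which exceeds maxMemory = 12000.
    val result4 = tryUnroll(24064L)
    assert(result4.isLeft)  // unroll was unsuccessful
    assert(used == 4000L)   // b2 was never evicted
    println(s"result4=$result4 used=$used")
  }
}
```

Because the first size estimate already exceeds the whole storage pool, the eviction branch is never reached, which is why the amended assertion `memoryStore.contains("b2")` holds.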
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]