Github user Ngone51 commented on a diff in the pull request:
https://github.com/apache/spark/pull/20676#discussion_r170961404
--- Diff: core/src/main/scala/org/apache/spark/storage/memory/MemoryStore.scala ---
@@ -246,18 +246,18 @@ private[spark] class MemoryStore(
         val amountToRequest = size - unrollMemoryUsedByThisBlock
         keepUnrolling = reserveUnrollMemoryForThisTask(blockId, amountToRequest, memoryMode)
         if (keepUnrolling) {
-          unrollMemoryUsedByThisBlock += amountToRequest
+          unrollMemoryUsedByThisBlock = size
         }
+      } else if (size < unrollMemoryUsedByThisBlock) {
+        releaseUnrollMemoryForThisTask(memoryMode, unrollMemoryUsedByThisBlock - size)
+        unrollMemoryUsedByThisBlock = size
       }
       if (keepUnrolling) {
         val entry = entryBuilder.build()
-        // Synchronize so that transfer is atomic
-        memoryManager.synchronized {
-          releaseUnrollMemoryForThisTask(memoryMode, unrollMemoryUsedByThisBlock)
-          val success = memoryManager.acquireStorageMemory(blockId, entry.size, memoryMode)
-          assert(success, "transferring unroll memory to storage memory failed")
-        }
--- End diff --
There is a conceptual difference between unroll memory and storage memory, and from that point of view it seems reasonable to transfer unroll memory to storage memory explicitly, even though it makes little practical difference to the underlying storage memory. I'm not sure whether this change could have side effects; if it has none, I'll approve it.
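
To make the "transfer" concrete: below is a minimal, self-contained toy model of that idea. The ToyMemoryPool class, its method names, and the 100-byte pool size are illustrative assumptions, not Spark's actual MemoryManager API. It shows why releasing unroll memory and immediately acquiring the same amount of storage memory is mostly an accounting move over one shared pool, and why the removed synchronized block made that move atomic.

```scala
// Toy model (not Spark's real MemoryManager): unroll memory and storage memory are
// separate accounting buckets over the same underlying pool, so "transferring" a
// block's bytes is a release-from-unroll followed by an acquire-as-storage, which
// leaves total usage unchanged.
object UnrollToStorageSketch {

  final class ToyMemoryPool(poolSize: Long) {
    private var unrollUsed: Long = 0L
    private var storageUsed: Long = 0L

    private def free: Long = poolSize - unrollUsed - storageUsed

    def acquireUnroll(bytes: Long): Boolean = synchronized {
      if (bytes <= free) { unrollUsed += bytes; true } else false
    }

    def acquireStorage(bytes: Long): Boolean = synchronized {
      if (bytes <= free) { storageUsed += bytes; true } else false
    }

    def releaseUnroll(bytes: Long): Unit = synchronized {
      unrollUsed = math.max(0L, unrollUsed - bytes)
    }

    // Mirrors the intent of the removed memoryManager.synchronized block: do the
    // release and the re-acquire under one lock so no other task can grab the
    // freed bytes in between.
    def transferUnrollToStorage(bytes: Long): Boolean = synchronized {
      releaseUnroll(bytes)
      acquireStorage(bytes)
    }

    override def toString: String = s"unroll=$unrollUsed, storage=$storageUsed, free=$free"
  }

  def main(args: Array[String]): Unit = {
    val pool = new ToyMemoryPool(poolSize = 100L)
    assert(pool.acquireUnroll(40L))            // unroll=40, storage=0
    assert(pool.transferUnrollToStorage(40L))  // unroll=0,  storage=40, total unchanged
    println(pool)
  }
}
```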
---