10110346 commented on a change in pull request #23679: [SPARK-23516][CORE]
Avoid repeated release/acquire on storage memory
URL: https://github.com/apache/spark/pull/23679#discussion_r252089033
##########
File path: core/src/main/scala/org/apache/spark/storage/memory/MemoryStore.scala
##########
@@ -566,7 +566,10 @@ private[spark] class MemoryStore(
   * Release memory used by this task for unrolling blocks.
   * If the amount is not specified, remove the current task's allocation altogether.
   */
-  def releaseUnrollMemoryForThisTask(memoryMode: MemoryMode, memory: Long = Long.MaxValue): Unit = {
+  def releaseUnrollMemoryForThisTask(
+      memoryMode: MemoryMode,
+      memory: Long = Long.MaxValue,
+      releaseMemoryReally: Boolean = true): Unit = {
Review comment:
Yeah, this modification is not very good. How about adding a new method that just releases memory from `onHeapUnrollMemoryMap` or `offHeapUnrollMemoryMap`, instead of adding the `releaseMemoryReally` flag?
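
For illustration, here is a minimal, hedged Scala sketch of what such a separate method could look like. It uses a simplified stand-in for the `MemoryStore` bookkeeping; the object `UnrollBookkeeping` and the method names are hypothetical and not part of Spark's actual code.

```scala
import scala.collection.mutable

// Toy model of the suggestion above: rather than threading a
// `releaseMemoryReally` flag through releaseUnrollMemoryForThisTask,
// expose a separate method that only drops the per-task bookkeeping
// entry and leaves the memory-manager accounting untouched.
// All names here are illustrative, not Spark's actual internals.
object UnrollBookkeeping {
  // taskAttemptId -> unroll memory currently tracked for that task
  private val onHeapUnrollMemoryMap = mutable.HashMap.empty[Long, Long]

  /** Existing behaviour: shrink the bookkeeping and release to the memory manager. */
  def releaseUnrollMemoryForTask(taskId: Long, memory: Long = Long.MaxValue): Unit = synchronized {
    onHeapUnrollMemoryMap.get(taskId).foreach { tracked =>
      val toRelease = math.min(memory, tracked)
      if (tracked - toRelease <= 0) onHeapUnrollMemoryMap.remove(taskId)
      else onHeapUnrollMemoryMap(taskId) = tracked - toRelease
      // ...the real MemoryStore would also call memoryManager.releaseUnrollMemory(...)
    }
  }

  /** Suggested alternative: clear the map entry only, with no release back to the manager. */
  def clearUnrollBookkeepingForTask(taskId: Long): Unit = synchronized {
    onHeapUnrollMemoryMap.remove(taskId)
  }
}
```

The point of a dedicated method would be that callers transferring unroll memory to storage memory could skip the intermediate release/acquire cycle, which is the goal stated in the SPARK-23516 title above.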