Github user mridulm commented on a diff in the pull request:
https://github.com/apache/spark/pull/21440#discussion_r202149408
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala ---
@@ -723,7 +728,9 @@ private[spark] class BlockManager(
       }
       if (data != null) {
-        return Some(new ChunkedByteBuffer(data))
+        val chunkSize =
+          conf.getSizeAsBytes("spark.storage.memoryMapLimitForTests", Int.MaxValue.toString).toInt
--- End diff ---
nit: Make `chunkSize` a `private` field in `BlockManager` instead of recomputing it each time?
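
For illustration, a minimal sketch of that suggestion, assuming the chunk size only needs to be read once per `BlockManager` instance. The cut-down class and the field name `remoteReadChunkSize` are hypothetical, not the actual PR code:

```scala
package org.apache.spark.storage

import org.apache.spark.SparkConf

// Trimmed-down stand-in for BlockManager; the real class takes many more
// constructor arguments. Shown only to illustrate hoisting the conf lookup.
private[spark] class BlockManagerSketch(conf: SparkConf) {

  // Read the config once at construction time instead of on every remote
  // fetch; the default preserves the existing Int.MaxValue chunk cap.
  private val remoteReadChunkSize: Int =
    conf.getSizeAsBytes("spark.storage.memoryMapLimitForTests", Int.MaxValue.toString).toInt

  // Call sites such as getRemoteBytes would then reference
  // `remoteReadChunkSize` rather than re-parsing the conf value each call.
}
```

Since the default string is unchanged, behavior would stay the same; the field just avoids repeating the lookup and parse on every call.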