LuciferYang commented on a change in pull request #31517:
URL: https://github.com/apache/spark/pull/31517#discussion_r677081396
##########
File path: core/src/main/scala/org/apache/spark/storage/BlockManagerId.scala
##########
@@ -136,11 +136,14 @@ private[spark] object BlockManagerId {
 * The max cache size is hardcoded to 10000, since the size of a BlockManagerId
 * object is about 48B, the total memory cost should be below 1MB which is feasible.
 */
-  val blockManagerIdCache = CacheBuilder.newBuilder()
-    .maximumSize(10000)
-    .build(new CacheLoader[BlockManagerId, BlockManagerId]() {
-      override def load(id: BlockManagerId) = id
-    })
+  val blockManagerIdCache = {
Review comment:
I didn't realize that usage was possible; my Scala knowledge is still limited. Thanks ~
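
For context, the hunk removes the Guava `CacheBuilder` loading cache and replaces it with a block-scoped initializer whose body is not shown here. Below is a minimal, hedged sketch of what an equivalent identity loading cache could look like if built with Caffeine instead of Guava; the `BlockManagerId` case class and the `getCachedBlockManagerId` helper are simplified stand-ins to keep the sketch self-contained, not the PR's actual code:

```scala
import com.github.benmanes.caffeine.cache.{CacheLoader, Caffeine, LoadingCache}

object BlockManagerIdCacheSketch {
  // Hypothetical stand-in for org.apache.spark.storage.BlockManagerId,
  // used only to keep this sketch self-contained.
  final case class BlockManagerId(executorId: String, host: String, port: Int)

  // Identity loader: the cache exists to deduplicate equal instances,
  // so loading a missing key simply returns the key itself.
  val blockManagerIdCache: LoadingCache[BlockManagerId, BlockManagerId] = {
    val loader = new CacheLoader[BlockManagerId, BlockManagerId] {
      override def load(id: BlockManagerId): BlockManagerId = id
    }
    Caffeine.newBuilder()
      .maximumSize(10000) // same bound as the removed Guava cache
      .build[BlockManagerId, BlockManagerId](loader)
  }

  // Returns the canonical cached instance for the given id.
  def getCachedBlockManagerId(id: BlockManagerId): BlockManagerId =
    blockManagerIdCache.get(id)
}
```

Wrapping the construction in a block as the new `+` line suggests keeps the loader local and lets the field expose only the finished `LoadingCache`.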
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.