LantaoJin commented on a change in pull request #29378:
URL: https://github.com/apache/spark/pull/29378#discussion_r467661282
##########
File path: core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala
##########
@@ -81,17 +99,29 @@ private[spark] class DiskBlockManager(conf: SparkConf, deleteFilesOnStop: Boolea
new File(subDir, filename)
}
- def getFile(blockId: BlockId): File = getFile(blockId.name)
+ /**
+ * Used only for testing.
+ */
+ private[spark] def getFile(filename: String): File =
+ getFile(localDirs, subDirs, subDirsPerLocalDir, filename)
+
+ def getFile(blockId: BlockId): File = {
+ if (containerDirEnabled && blockId.isTemp) {
+ getFile(containerDirs, subContainerDirs, subDirsPerLocalDir, blockId.name)
+ } else {
+ getFile(localDirs, subDirs, subDirsPerLocalDir, blockId.name)
+ }
+ }
/** Check if disk block manager has a block. */
def containsBlock(blockId: BlockId): Boolean = {
- getFile(blockId.name).exists()
Review comment:
`def getFile(blockId: BlockId)` checks whether the blockId is a temp block; if it is, the file is stored under the container directory. `def getFile(filename: String)` only stores a block under the local directory.
It looks like we can make `def getFile(filename: String)` private, since this PR switches all callers outside the class to `def getFile(blockId: BlockId)`, for example (a standalone sketch of this dispatch follows the diff):
```scala
- val targetFile = diskManager.getFile(targetBlockId.name)
+ val targetFile = diskManager.getFile(targetBlockId)
```
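For illustration only, here is a minimal, self-contained sketch of the dispatch described above. It is not the real `DiskBlockManager` code; the object, class, and helper names (`GetFileDispatchSketch`, `TempBlockId`, `RddBlockId`, `resolve`) and the directory values are made up for this example.
```scala
import java.io.File

// Hypothetical, simplified model of the getFile dispatch discussed above.
// None of these members are the actual DiskBlockManager members.
object GetFileDispatchSketch {
  sealed trait BlockId { def name: String; def isTemp: Boolean }
  final case class TempBlockId(name: String) extends BlockId { val isTemp = true }
  final case class RddBlockId(name: String) extends BlockId { val isTemp = false }

  private val containerDirEnabled = true
  private val containerDirs = Seq(new File("/tmp/container"))
  private val localDirs = Seq(new File("/tmp/local"))

  // Public entry point: callers pass a BlockId, never a raw filename.
  def getFile(blockId: BlockId): File =
    if (containerDirEnabled && blockId.isTemp) resolve(containerDirs, blockId.name)
    else resolve(localDirs, blockId.name)

  // Narrowed visibility, mirroring the suggestion in this comment: the
  // filename-based lookup is an internal detail of the class.
  private def resolve(dirs: Seq[File], filename: String): File =
    new File(dirs(math.abs(filename.hashCode) % dirs.length), filename)

  def main(args: Array[String]): Unit = {
    println(getFile(TempBlockId("temp_shuffle_0"))) // resolves under /tmp/container
    println(getFile(RddBlockId("rdd_1_2")))         // resolves under /tmp/local
  }
}
```
The point of keeping only the `BlockId` overload public is that the temp-vs-local decision cannot be bypassed by outside callers.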