mridulm commented on code in PR #40412:
URL: https://github.com/apache/spark/pull/40412#discussion_r1199782358
##########
core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala:
##########
@@ -273,7 +273,7 @@ private[spark] class DiskBlockManager(
Utils.getConfiguredLocalDirs(conf).foreach { rootDir =>
try {
val mergeDir = new File(rootDir, mergeDirName)
- if (!mergeDir.exists()) {
+ if (!mergeDir.exists() || mergeDir.listFiles().length < subDirsPerLocalDir) {
Review Comment:
I am not sure I follow the comment above.
What I meant was: instead of simply checking the expected file count, check
for the actual directories we expect to find there.
As pseudo-code, something like:
```
val dirs = Option(mergeDir.listFiles()).map(_.filter(_.isDirectory).toSet)
if (!mergeDir.exists() || dirs.isEmpty ||
    (0 until subDirsPerLocalDir).exists { dirNum =>
      !dirs.get.contains(new File(mergeDir, "%02x".format(dirNum)))
    })
```
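For illustration, here is a self-contained sketch of that check. The temp directory and the hard-coded `subDirsPerLocalDir` value are hypothetical stand-ins for the real `DiskBlockManager` fields; the `"%02x"` naming matches how Spark names its sub-directories (`"00"`, `"01"`, ...):

```scala
import java.io.File
import java.nio.file.Files

// Hypothetical stand-ins for the real DiskBlockManager state.
val subDirsPerLocalDir = 4
val mergeDir = Files.createTempDirectory("merge_manager").toFile

// Pre-create only two of the four expected sub-directories.
(0 until 2).foreach(i => new File(mergeDir, "%02x".format(i)).mkdir())

// listFiles() returns null for a missing directory, hence the Option.
val dirs = Option(mergeDir.listFiles()).map(_.filter(_.isDirectory).toSet)
val needsCreation = !mergeDir.exists() || dirs.isEmpty ||
  (0 until subDirsPerLocalDir).exists { dirNum =>
    !dirs.get.contains(new File(mergeDir, "%02x".format(dirNum)))
  }

// true here: sub-directories "02" and "03" are missing.
println(needsCreation)
```

Unlike a bare count comparison, this flags a directory that has the right number of entries but the wrong names (e.g. stray files left behind), which is the failure mode the count check would miss.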
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]