HeartSaVioR commented on a change in pull request #25488: [SPARK-28025][SS] Fix
FileContextBasedCheckpointFileManager leaking crc files
URL: https://github.com/apache/spark/pull/25488#discussion_r316937534
##########
File path:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/CheckpointFileManager.scala
##########
```
@@ -327,12 +327,14 @@ class FileContextBasedCheckpointFileManager(path: Path, hadoopConf: Configuration
   override def renameTempFile(srcPath: Path, dstPath: Path, overwriteIfPossible: Boolean): Unit = {
     import Options.Rename._
     fc.rename(srcPath, dstPath, if (overwriteIfPossible) OVERWRITE else NONE)
+    mayRemoveCrcFile(srcPath)
   }

   override def delete(path: Path): Unit = {
     try {
       fc.delete(path, true)
+      mayRemoveCrcFile(path)
```
Review comment:
Ah, thanks for pointing that out.
I just realized my earlier analysis was incorrect: I confused classes in the stack trace and so misidentified the root cause of the bug. I had thought `FileContext` wasn't picking up `LocalFs` (which extends `ChecksumFs`), but that turned out not to be the case. The actual root cause is that there are two `renameInternal` methods:
```
public void renameInternal(Path src, Path dst)
public void renameInternal(final Path src, final Path dst, boolean overwrite)
```
Both should be overridden to handle all cases, but `ChecksumFs` only overrides the two-parameter variant. So when the three-parameter variant is called, `FilterFs.renameInternal(...)` runs instead, and it performs the rename with `RawLocalFs` as the underlying filesystem, skipping the checksum handling and leaving the `.crc` file behind.
I'll update the description of the PR and remove this line.
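To make the dispatch behavior concrete: a minimal, self-contained Java sketch (hypothetical class names `Base` and `Checksummed`, standing in for `ChecksumFs`/`FilterFs`) of what happens when a subclass overrides only one of two overloads. The un-overridden three-parameter overload resolves to the parent class, just as the three-parameter `renameInternal` falls through to `FilterFs` and bypasses the checksum logic:

```java
// Hypothetical sketch of the overload-override pitfall; not Hadoop's actual code.
class Base {
    // Two-arg variant: subclasses that override only this one still
    // inherit the three-arg variant below.
    public String rename(String src, String dst) {
        return "Base.rename2";
    }

    // Three-arg variant does NOT delegate to the two-arg one, mirroring
    // FilterFs.renameInternal going straight to the raw filesystem.
    public String rename(String src, String dst, boolean overwrite) {
        return "Base.rename3";
    }
}

class Checksummed extends Base {
    // Only the two-arg overload is overridden, as ChecksumFs does.
    @Override
    public String rename(String src, String dst) {
        return "Checksummed.rename2";
    }
}

public class OverloadPitfall {
    public static void main(String[] args) {
        Base fs = new Checksummed();
        System.out.println(fs.rename("a", "b"));       // Checksummed.rename2
        System.out.println(fs.rename("a", "b", true)); // Base.rename3 -- override skipped
    }
}
```

The fix in cases like this is either to override every overload or to have the wider overload delegate to the narrower one in the base class.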
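For context on the cleanup side, here is a hedged sketch of what a helper like `mayRemoveCrcFile` might do. This is not the PR's actual Scala implementation; it only assumes Hadoop's `ChecksumFileSystem` convention of writing a `.NAME.crc` sibling next to each file `NAME`:

```java
import java.io.File;

// Hypothetical illustration (not the PR's actual code): best-effort removal of
// the ".NAME.crc" sibling that a checksum filesystem leaves next to "NAME".
public class CrcCleanup {
    // Returns true if a leftover checksum sibling existed and was removed.
    public static boolean mayRemoveCrcFile(File path) {
        File crc = new File(path.getParentFile(), "." + path.getName() + ".crc");
        return crc.exists() && crc.delete();
    }
}
```

Calling such a helper after `rename` and `delete`, as the diff above does, keeps the checkpoint directory free of orphaned `.crc` files even when the rename/delete went through the raw filesystem.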
----------------------------------------------------------------
This is an automated message from the Apache Git Service.