HeartSaVioR commented on a change in pull request #22952: [SPARK-20568][SS] Provide option to clean up completed files in streaming query
URL: https://github.com/apache/spark/pull/22952#discussion_r256641131
##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/FileStreamSource.scala
##########
@@ -257,16 +261,33 @@ class FileStreamSource(
    * equal to `end` and will only request offsets greater than `end` in the future.
    */
   override def commit(end: Offset): Unit = {
-    // No-op for now; FileStreamSource currently garbage-collects files based on timestamp
-    // and the value of the maxFileAge parameter.
+    val logOffset = FileStreamSourceOffset(end).logOffset
+
+    if (sourceOptions.cleanSource != CleanSourceMode.NO_OP) {
+      val files = metadataLog.get(Some(logOffset), Some(logOffset)).flatMap(_._2)
+      val validFileEntities = files.filter(_.batchId == logOffset)
+      logDebug(s"completed file entries: ${validFileEntities.mkString(",")}")
+      sourceOptions.cleanSource match {
+        case CleanSourceMode.ARCHIVE =>
+          validFileEntities.foreach(sourceCleaner.archive)
+
+        case CleanSourceMode.DELETE =>
Review comment:
I guess I don't touch the mechanism of how the file source deals with files, which means this patch doesn't change the existing behavior. (Please correct me if I'm missing something here.)
That's why I need to check how the current file stream source deals with a new file that has the same name. If the behavior of the current master and this patch is the same, and it doesn't work as we expect/want, addressing it is beyond the scope of this PR and is better filed and addressed separately.
Another thing to note is that we also need to consider options like `fileNameOnly`, which make things more complicated.
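To make the `fileNameOnly` concern concrete, here is a minimal sketch. The `SeenFilesSketch` class and its methods are made up for illustration and are not the actual `FileStreamSource` internals; it only shows how keying already-seen files by file name alone, instead of by full path, changes whether a re-created file with the same name is picked up again after the original was archived or deleted:

```scala
import scala.collection.mutable

// Illustrative only: a simplified dedup map keyed either by full path or by
// file name alone, mirroring the idea behind the `fileNameOnly` option.
class SeenFilesSketch(fileNameOnly: Boolean) {
  private val seen = mutable.HashSet[String]()

  // When fileNameOnly is enabled, two files with the same name in different
  // directories collapse to one key, so the second one looks already seen.
  private def key(path: String): String =
    if (fileNameOnly) path.substring(path.lastIndexOf('/') + 1) else path

  def isNew(path: String): Boolean = !seen.contains(key(path))

  def markSeen(path: String): Unit = seen += key(path)
}

object SeenFilesSketchExample extends App {
  val byName = new SeenFilesSketch(fileNameOnly = true)
  byName.markSeen("/data/in/part-0001.json")
  // A later file with the same name (possibly in another directory, or
  // re-created after archive/delete) is treated as old.
  println(byName.isNew("/staging/part-0001.json")) // false

  val byPath = new SeenFilesSketch(fileNameOnly = false)
  byPath.markSeen("/data/in/part-0001.json")
  println(byPath.isNew("/staging/part-0001.json")) // true
}
```

Whatever the real dedup keys are, the interaction between archiving/deleting a source file and later re-ingesting a file with the same name is the behavior to verify against current master, as noted above.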