HeartSaVioR commented on a change in pull request #22952: [SPARK-20568][SS] Provide option to clean up completed files in streaming query
URL: https://github.com/apache/spark/pull/22952#discussion_r334714556
##########
File path:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/FileStreamSource.scala
##########
@@ -258,16 +264,33 @@ class FileStreamSource(
    * equal to `end` and will only request offsets greater than `end` in the future.
    */
   override def commit(end: Offset): Unit = {
 -    // No-op for now; FileStreamSource currently garbage-collects files based on timestamp
 -    // and the value of the maxFileAge parameter.
 +    val logOffset = FileStreamSourceOffset(end).logOffset
 +
 +    if (sourceOptions.cleanSource != CleanSourceMode.NO_OP) {
Review comment:
That was suggested earlier and we agreed to handle it in a follow-up issue - this PR has just been sitting longer than expected. Could we address it in the follow-up issue?
Btw, if we don't guarantee the cleanup and only do it on a best-effort basis (which we effectively do already), it doesn't matter much whether it runs on a background thread - except for the problem you've mentioned. That could be remedied by retaining only a bounded number of paths to clean up (yes, this still relies on "best effort"); a rough sketch is below.
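To make the idea concrete, here is a rough, hypothetical sketch of a best-effort background cleaner with a cap on pending paths. None of these names (`CompletedFileCleaner`, `maxPendingPaths`, `schedule`) exist in this PR, and the real integration with `FileStreamSource` would look different:

```scala
import java.util.concurrent.{Executors, LinkedBlockingQueue}

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path

// Hypothetical sketch, not code from this PR: a best-effort background cleaner
// that keeps at most `maxPendingPaths` completed files queued for deletion.
class CompletedFileCleaner(hadoopConf: Configuration, maxPendingPaths: Int) {

  // Bounded queue: once it is full, newly scheduled paths are dropped (best effort).
  private val pending = new LinkedBlockingQueue[Path](maxPendingPaths)

  private val executor = Executors.newSingleThreadExecutor()

  executor.submit(new Runnable {
    override def run(): Unit = {
      try {
        while (true) {
          val path = pending.take() // blocks until a path is available
          try {
            val fs = path.getFileSystem(hadoopConf)
            fs.delete(path, false)  // ignore the return value; best effort
          } catch {
            case _: Exception =>
              // A real implementation would log here; cleanup must never fail the query.
          }
        }
      } catch {
        case _: InterruptedException => // stop() was called; exit quietly
      }
    }
  })

  /** Called from commit(); never blocks the micro-batch thread. */
  def schedule(path: Path): Unit = {
    // offer() returns false and drops the path when the queue is full.
    pending.offer(path)
  }

  def stop(): Unit = executor.shutdownNow()
}
```

The point is that something like `schedule` never blocks the micro-batch thread and silently drops paths once the cap is hit, which matches the "best effort" guarantee we already give; `stop()` would be wired into the source's own `stop()`.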
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]