wypoon commented on code in PR #4588:
URL: https://github.com/apache/iceberg/pull/4588#discussion_r937374663
##########
data/src/main/java/org/apache/iceberg/data/DeleteFilter.java:
##########
@@ -67,22 +72,26 @@
private final List<DeleteFile> eqDeletes;
private final Schema requiredSchema;
private final Accessor<StructLike> posAccessor;
+ private final DeleteCounter counter;
private PositionDeleteIndex deleteRowPositions = null;
- private Predicate<T> eqDeleteRows = null;
- protected DeleteFilter(String filePath, List<DeleteFile> deletes, Schema tableSchema, Schema requestedSchema) {
+ protected DeleteFilter(String filePath, List<DeleteFile> deletes, Schema tableSchema, Schema requestedSchema,
+     DeleteCounter counter) {
Review Comment:
Update: The `DeleteCounter` is now created in `BaseReader` and passed to the
instance of `BaseReader.SparkDeleteFilter` constructed in the
`open(FileScanTask)` method of `RowDataReader`/`BatchDataReader`. The argument
still stands: we need a single counter in the reader that aggregates the
delete count over all the `FileScanTask`s.
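
For illustration, here is a minimal sketch of that shape. The class and method
names below are made up for the example and are not the actual
`BaseReader`/`SparkDeleteFilter`/`DeleteCounter` code in this PR; the point is
only the wiring: one counter lives in the reader and is handed to every
per-task delete filter, so increments from all tasks land in the same object.

```java
// Hypothetical names for illustration only; not the real Iceberg classes.
class CounterSketch {
  private long count = 0;

  void increment() {
    count++;
  }

  long get() {
    return count;
  }
}

class DeleteFilterSketch {
  private final CounterSketch counter;

  DeleteFilterSketch(CounterSketch counter) {
    this.counter = counter;   // shared counter injected by the reader
  }

  void markRowDeleted() {
    counter.increment();      // every filtered-out row bumps the shared counter
  }
}

class ReaderSketch {
  // One counter per reader, created once and reused for every task.
  private final CounterSketch counter = new CounterSketch();

  DeleteFilterSketch open(String fileScanTask) {
    // A new filter is built per task, but all filters share the reader's
    // counter, so the delete count is aggregated across all tasks.
    return new DeleteFilterSketch(counter);
  }

  long totalDeletes() {
    return counter.get();
  }
}
```

With this wiring, calling `open(...)` for the next task never resets the
count; `totalDeletes()` reflects every task the reader has processed so far.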
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]