rdblue commented on a change in pull request #4060:
URL: https://github.com/apache/iceberg/pull/4060#discussion_r801824508
##########
File path:
flink/v1.14/flink/src/main/java/org/apache/iceberg/flink/source/DataIterator.java
##########
@@ -41,16 +42,49 @@
private final FileScanTaskReader<T> fileScanTaskReader;
private final InputFilesDecryptor inputFilesDecryptor;
- private Iterator<FileScanTask> tasks;
+ private final CombinedScanTask combinedTask;
+
+ private Iterator<FileScanTask> fileTasksIterator;
private CloseableIterator<T> currentIterator;
+ private int fileOffset;
+ private long recordOffset;
public DataIterator(FileScanTaskReader<T> fileScanTaskReader,
CombinedScanTask task,
FileIO io, EncryptionManager encryption) {
this.fileScanTaskReader = fileScanTaskReader;
this.inputFilesDecryptor = new InputFilesDecryptor(task, io, encryption);
- this.tasks = task.files().iterator();
+ this.combinedTask = task;
+
+ this.fileTasksIterator = task.files().iterator();
this.currentIterator = CloseableIterator.empty();
+
+    // fileOffset starts at -1 because the initial currentIterator is
+    // an empty iterator that does not come from any file in the split.
+ this.fileOffset = -1;
+ this.recordOffset = 0L;
Review comment:
It looks like this position is before the current record, in contrast to
the position stored with each record, which is the position of the next record.
I think that's worth noting in the class Javadoc. I think the model is that you
keep track of the last record consumed and use the offset from that record to
initialize this class. So the first record of a split will be
`RowAndPosition((row data ...), (file=0, record=1))` and if you were to resume
from that record, Flink would call `seek(0, 1)` and produce the second row in
file 0. Right?
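
To make sure I'm reading the model right, here is a minimal sketch of the semantics I'm describing. The class, method names, and string output format are illustrative only, not the actual `DataIterator` API: each returned record carries the position of the *next* record, so resuming via `seek(fileOffset, recordOffset)` from a checkpoint replays nothing.

```java
import java.util.List;

// Hypothetical model of the position semantics, not Iceberg code.
class PositionModel {
  private final List<List<String>> files; // records grouped by file
  private int fileOffset = -1;            // -1: before any file is opened
  private long recordOffset = 0L;

  PositionModel(List<List<String>> files) {
    this.files = files;
  }

  // Jump so that the next record returned is files[file].get(record).
  void seek(int file, long record) {
    this.fileOffset = file;
    this.recordOffset = record;
  }

  // Returns the row plus the position of the record AFTER it,
  // i.e. the offsets a checkpoint would store with this record.
  String next() {
    if (fileOffset < 0) {
      fileOffset = 0;
      recordOffset = 0L;
    }
    // Advance to the next file once the current one is exhausted.
    while (fileOffset < files.size()
        && recordOffset >= files.get(fileOffset).size()) {
      fileOffset++;
      recordOffset = 0L;
    }
    String row = files.get(fileOffset).get((int) recordOffset);
    recordOffset++; // position now points at the next record
    return row + " -> (file=" + fileOffset + ", record=" + recordOffset + ")";
  }

  public static void main(String[] args) {
    List<List<String>> files = List.of(List.of("a0", "a1"), List.of("b0"));

    PositionModel fresh = new PositionModel(files);
    System.out.println(fresh.next()); // a0 -> (file=0, record=1)

    // Resume from the checkpoint stored with the first record.
    PositionModel resumed = new PositionModel(files);
    resumed.seek(0, 1);
    System.out.println(resumed.next()); // a1 -> (file=0, record=2)
  }
}
```

If that matches the intended contract, documenting it in the class Javadoc would make the off-by-one between "current" and "next" positions explicit.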
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]