zhangjun0x01 commented on a change in pull request #1822:
URL: https://github.com/apache/iceberg/pull/1822#discussion_r535792817
##########
File path: flink/src/main/java/org/apache/iceberg/flink/IcebergTableSource.java
##########
@@ -36,25 +36,30 @@
 /**
  * Flink Iceberg table source.
- * TODO: Implement {@link FilterableTableSource} and {@link LimitableTableSource}.
+ * TODO: Implement {@link FilterableTableSource}
  */
-public class IcebergTableSource implements StreamTableSource<RowData>, ProjectableTableSource<RowData> {
+public class IcebergTableSource
+    implements StreamTableSource<RowData>, ProjectableTableSource<RowData>, LimitableTableSource<RowData> {
   private final TableLoader loader;
   private final TableSchema schema;
   private final Map<String, String> properties;
   private final int[] projectedFields;
+  private boolean isLimitPushDown = false;
+  private long limit = -1L;

   public IcebergTableSource(TableLoader loader, TableSchema schema, Map<String, String> properties) {
-    this(loader, schema, properties, null);
+    this(loader, schema, properties, null, false, -1);
Review comment:
I think it may be related to the design of the `LimitableTableSource` interface in Flink 1.11. I looked at some implementation classes of `LimitableTableSource` in Flink, such as [HiveTableSource](https://github.com/apache/flink/blob/release-1.11/flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/connectors/hive/HiveTableSource.java#L123). By default, limit pushdown is disabled.
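   For readers unfamiliar with the Flink 1.11 contract, here is a minimal sketch of the `LimitableTableSource` pattern the diff follows: pushdown starts out disabled (`isLimitPushDown = false`, `limit = -1`), and the planner enables it by calling `applyLimit(long)`, which should return a source instance with the limit applied. The class name `LimitAwareSource` is hypothetical and the scan construction is stubbed out; this illustrates the interface, not the actual code in this PR.

   ```java
   import java.util.Map;

   import org.apache.flink.streaming.api.datastream.DataStream;
   import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
   import org.apache.flink.table.api.TableSchema;
   import org.apache.flink.table.data.RowData;
   import org.apache.flink.table.sources.LimitableTableSource;
   import org.apache.flink.table.sources.StreamTableSource;
   import org.apache.flink.table.sources.TableSource;
   import org.apache.flink.table.types.DataType;

   // Sketch of the Flink 1.11 LimitableTableSource contract, modeled on the
   // default-disabled pattern HiveTableSource uses. Illustrative only.
   public class LimitAwareSource implements StreamTableSource<RowData>, LimitableTableSource<RowData> {

     private final TableSchema schema;
     private final Map<String, String> properties;
     private final boolean isLimitPushDown;
     private final long limit;

     public LimitAwareSource(TableSchema schema, Map<String, String> properties) {
       // Limit pushdown is disabled until the planner explicitly applies a limit,
       // matching the `this(loader, schema, properties, null, false, -1)` call in the diff.
       this(schema, properties, false, -1L);
     }

     private LimitAwareSource(TableSchema schema, Map<String, String> properties,
                              boolean isLimitPushDown, long limit) {
       this.schema = schema;
       this.properties = properties;
       this.isLimitPushDown = isLimitPushDown;
       this.limit = limit;
     }

     @Override
     public boolean isLimitPushedDown() {
       // Tells the planner whether applyLimit() has already taken effect.
       return isLimitPushDown;
     }

     @Override
     public TableSource<RowData> applyLimit(long newLimit) {
       // Return a copy with pushdown enabled instead of mutating this source.
       return new LimitAwareSource(schema, properties, true, newLimit);
     }

     @Override
     public TableSchema getTableSchema() {
       return schema;
     }

     @Override
     public DataType getProducedDataType() {
       return schema.toRowDataType();
     }

     @Override
     public DataStream<RowData> getDataStream(StreamExecutionEnvironment execEnv) {
       // A real source would build the scan here and pass `limit` down to it.
       throw new UnsupportedOperationException("Scan construction omitted from this sketch");
     }
   }
   ```

   Returning a new instance from `applyLimit()` rather than mutating the existing source keeps the original source intact if the planner decides not to push the limit down.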