luoyuxia commented on code in PR #20016:
URL: https://github.com/apache/flink/pull/20016#discussion_r1037722863


##########
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/connectors/hive/HiveTableSource.java:
##########
@@ -257,6 +279,12 @@ public void applyProjection(int[][] projectedFields, DataType producedDataType)
         this.projectedFields = Arrays.stream(projectedFields).mapToInt(value -> value[0]).toArray();
     }
 
+    @Override
+    public Result applyFilters(List<ResolvedExpression> filters) {
+        this.filters = new ArrayList<>(filters);
+        return Result.of(new ArrayList<>(filters), new ArrayList<>(filters));

Review Comment:
   I don't think identifying the format in HiveTableSource is possible, considering that different partitions may have different formats and we don't know which partitions it expects to read.
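
   For context, a minimal, Flink-free sketch of the conservative pattern the diff above uses: `applyFilters` reports every filter as both accepted (pushed to the source) and remaining (re-applied by the runtime), so correctness holds even if the source only partially applies a predicate. The `Result` record and string filters here are stand-ins, not Flink's actual `SupportsFilterPushDown` API:

   ```java
   import java.util.ArrayList;
   import java.util.List;

   public class FilterPushDownSketch {
       // Hypothetical stand-in for SupportsFilterPushDown.Result:
       // 'accepted' filters are handed to the source for pushdown,
       // 'remaining' filters are re-evaluated by the planner's Filter node.
       record Result(List<String> accepted, List<String> remaining) {}

       // Conservative strategy mirrored from the diff: accept all filters
       // but also return all of them as remaining, so the runtime keeps a
       // Filter operator as a safety net.
       static Result applyFilters(List<String> filters) {
           return new Result(new ArrayList<>(filters), new ArrayList<>(filters));
       }

       public static void main(String[] args) {
           Result r = applyFilters(List.of("p = 1", "x > 5"));
           System.out.println(r.accepted());
           System.out.println(r.remaining());
       }
   }
   ```

   The copies (`new ArrayList<>(filters)`) matter: the planner may mutate the returned lists, so the source should not hand back the list it stores internally.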



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
