HuangZhenQiu commented on a change in pull request #10371: [FLINK-14953][formats] use table type to build parquet FilterPredicate
URL: https://github.com/apache/flink/pull/10371#discussion_r355160821
 
 

 ##########
 File path: flink-formats/flink-parquet/src/main/java/org/apache/flink/formats/parquet/ParquetTableSource.java
 ##########
 @@ -447,8 +453,16 @@ private String getColumnName(BinaryComparison comp) {
 
        @Nullable
        private Tuple2<Column, Comparable> extractColumnAndLiteral(BinaryComparison comp) {
-               TypeInformation<?> typeInfo = getLiteralType(comp);
                String columnName = getColumnName(comp);
+               TypeInformation<?> typeInfo = null;
+               try {
+                       String[] columnPath = columnName.split("\\.");
+                       Type type = parquetSchema.getType(columnPath);
+                       typeInfo = ParquetSchemaConverter.convertParquetTypeToTypeInfo(type);
+               } catch (InvalidRecordException e) {
 
 Review comment:
   It is not needed if users go through SQL. But since the applyPredicate function is public, a user could still pass an expression that references a field not defined in the schema. ParquetTableSourceTest also covers this scenario. I catch the exception here so that the original test cases still pass.
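
   For illustration, here is a minimal, self-contained sketch (not the actual ParquetTableSource code) of the lookup-and-skip pattern discussed above: resolve a possibly nested, dot-separated column path against the Parquet MessageType, and treat an undefined field as "no predicate" instead of failing. The helper name resolveColumnType, the class name, and the example schema are made up for this sketch; only MessageType.getType, InvalidRecordException, and the split on "\\." come from the change itself.

   import org.apache.parquet.io.InvalidRecordException;
   import org.apache.parquet.schema.MessageType;
   import org.apache.parquet.schema.MessageTypeParser;
   import org.apache.parquet.schema.Type;

   public class SchemaLookupSketch {

           /**
            * Returns the Parquet type for a dot-separated column path,
            * or null if the field is not defined in the schema.
            */
           static Type resolveColumnType(MessageType schema, String columnName) {
                   try {
                           String[] columnPath = columnName.split("\\.");
                           return schema.getType(columnPath);
                   } catch (InvalidRecordException e) {
                           // The column does not exist in the schema, e.g. a predicate handed
                           // to the public applyPredicate(...) that references an undefined
                           // field; returning null means the predicate is simply not pushed down.
                           return null;
                   }
           }

           public static void main(String[] args) {
                   // Hypothetical schema, only for demonstrating the lookup.
                   MessageType schema = MessageTypeParser.parseMessageType(
                           "message row { required int64 id; optional group nested { optional binary name; } }");
                   System.out.println(resolveColumnType(schema, "nested.name")); // prints the resolved field type
                   System.out.println(resolveColumnType(schema, "missing"));     // prints null -> predicate is skipped
           }
   }

   Returning null (rather than rethrowing) keeps the existing behavior for valid predicates while silently ignoring ones on undefined fields, which is why the original test cases continue to pass.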
