jelly-1203 commented on a change in pull request #18017:
URL: https://github.com/apache/flink/pull/18017#discussion_r765958845



##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/operations/MergeTableLikeUtil.java
##########
@@ -494,7 +494,13 @@ private void collectPhysicalFieldsTypes(List<SqlNode> derivedColumns) {
                     boolean nullable = type.getNullable() == null ? true : type.getNullable();
                     RelDataType relType = type.deriveType(sqlValidator, nullable);
                     // add field name and field type to physical field list
-                    physicalFieldNamesToTypes.put(name, relType);
+                    RelDataType oldType = physicalFieldNamesToTypes.put(name, relType);
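
For context, the change relies on `Map.put` returning the value previously mapped to the key. A minimal standalone sketch of that duplicate check, using plain `String` types and an illustrative exception rather than Flink's `RelDataType` and validation exception, might look like this:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal sketch of the duplicate check above: Map.put returns the value that
// was previously mapped to the key, so a non-null result means the physical
// column name was already registered. Types and the exception type here are
// illustrative assumptions, not Flink's actual API.
class DuplicateColumnCheckSketch {
    private final Map<String, String> physicalFieldNamesToTypes = new LinkedHashMap<>();

    void addPhysicalColumn(String name, String type) {
        String oldType = physicalFieldNamesToTypes.put(name, type);
        if (oldType != null) {
            throw new IllegalArgumentException(
                    "A column named '" + name + "' already exists in the table.");
        }
    }
}
```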

Review comment:
       > I think it is not enough to add the check here; when there are name conflicts involving a computed column or a metadata column, the check here would not work well. You can try to add validation in appendDerivedColumns.
   
   Hi @wenlong88, thanks for your advice, which is of great help to me. I found several problems during testing:
   1. When a computed column comes first and a regular column with the same name comes last, the duplicate column is silently overwritten.
   2. When a metadata column comes first, a computed column is in the middle, and a regular column with the same name as the metadata column comes last, the duplicate field is also overwritten while the computed columns are generated, namely by the `accessibleFieldNamesToTypes.putAll` call (see the sketch after this comment).
   
   I have adjusted the code accordingly and added tests for the adjustment. Please help review it.
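
For illustration only, here is a hypothetical sketch of the `putAll` issue described in point 2 and the kind of pre-merge check that surfaces it; the method and map names are assumptions, not the actual code in `appendDerivedColumns`:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch: putAll silently overwrites entries that share a key,
// which is why a metadata or computed column clashing with a regular column
// went unnoticed. Checking for key overlap before merging surfaces the
// conflict instead. Names here are hypothetical, not Flink's actual code.
class MergeColumnsSketch {

    static void mergeColumns(
            Map<String, String> accessibleFieldNamesToTypes,
            Map<String, String> derivedFieldNamesToTypes) {
        for (String name : derivedFieldNamesToTypes.keySet()) {
            if (accessibleFieldNamesToTypes.containsKey(name)) {
                throw new IllegalArgumentException(
                        "A column named '" + name + "' already exists in the table.");
            }
        }
        // Merge only after verifying there are no duplicate column names.
        accessibleFieldNamesToTypes.putAll(derivedFieldNamesToTypes);
    }

    public static void main(String[] args) {
        Map<String, String> existing = new LinkedHashMap<>();
        existing.put("ts", "TIMESTAMP(3) METADATA");
        Map<String, String> derived = new LinkedHashMap<>();
        derived.put("ts", "TIMESTAMP(3)"); // clashes with the metadata column
        mergeColumns(existing, derived);   // throws: duplicate column 'ts'
    }
}
```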



