LadyForest commented on code in PR #19329:
URL: https://github.com/apache/flink/pull/19329#discussion_r1057594885


##########
flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/utils/OperationConverterUtils.java:
##########
@@ -144,6 +147,163 @@ public static Operation convertChangeColumn(
         // TODO: handle watermark and constraints
     }
 
+    public static Operation convertRenameColumn(
+            ObjectIdentifier tableIdentifier,
+            String originColumnName,
+            String newColumnName,
+            CatalogTable originTable,
+            ResolvedSchema originResolveSchema) {
+        Schema originSchema = originTable.getUnresolvedSchema();

Review Comment:
   Can we reuse `SchemaResolver` to simplify this check, just like ALTER TABLE ADD does?
   ```java
   Schema originSchema = originTable.getUnresolvedSchema();
   List<Schema.UnresolvedColumn> newColumns =
           originSchema.getColumns().stream()
                   .map(
                           column -> {
                               if (column.getName().equals(originColumnName)) {
                                   return convertColumn(column, newColumnName);
                               } else {
                                   return column;
                               }
                           })
                   .collect(Collectors.toList());

   Schema updatedSchema =
           Schema.newBuilder().fromSchema(originSchema).replaceColumns(newColumns).build();
   // try {
   //     schemaResolver.resolve(updatedSchema);
   //     return updatedSchema;
   // } catch (Exception e) {
   //     throw new ValidationException(
   //             String.format("%s%s", EX_MSG_PREFIX, e.getMessage()), e);
   // }
   return new AlterTableSchemaOperation(
           tableIdentifier,
           CatalogTable.of(
                   updatedSchema,
                   originTable.getComment(),
                   originTable.getPartitionKeys(),
                   originTable.getOptions()));
   ```
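   For reference, the commented-out validation could be wired up roughly as follows once a `SchemaResolver` is in scope. This is only a sketch mirroring the ALTER TABLE ADD code path: the `EX_MSG_PREFIX` value and how the resolver is obtained are assumptions, not code from this PR.
   ```java
   // Sketch only. Assumed imports: org.apache.flink.table.api.Schema,
   // org.apache.flink.table.api.ValidationException,
   // org.apache.flink.table.catalog.SchemaResolver.
   // EX_MSG_PREFIX is a placeholder for whatever error prefix the converter uses.
   private static final String EX_MSG_PREFIX = "Could not execute ALTER TABLE RENAME COLUMN: ";

   private static Schema validateSchema(SchemaResolver schemaResolver, Schema updatedSchema) {
       try {
           // Resolving re-checks column references, watermark expressions, and
           // primary key constraints against the renamed column before the
           // AlterTableSchemaOperation is built.
           schemaResolver.resolve(updatedSchema);
           return updatedSchema;
       } catch (Exception e) {
           throw new ValidationException(
                   String.format("%s%s", EX_MSG_PREFIX, e.getMessage()), e);
       }
   }
   ```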



##########
flink-table/flink-sql-parser/src/main/codegen/includes/parserImpls.ftl:
##########
@@ -608,6 +612,18 @@ SqlAlterTable SqlAlterTable() :
                         tableIdentifier,
                         newTableIdentifier);
         }
+    |
+        <RENAME>
+            originColumnIdentifier = SimpleIdentifier()

Review Comment:
   Do we plan to support nested columns at the syntax level, just like `ALTER TABLE ADD/MODIFY` does?
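   For context, the two shapes at the SQL level would look roughly like the snippet below (assuming a `TableEnvironment` named `tableEnv`; table and column names are made up, and the nested form is exactly the open question, since it would likely need a `CompoundIdentifier()` production rather than `SimpleIdentifier()`):
   ```java
   // Simple rename, which the production above already parses:
   tableEnv.executeSql("ALTER TABLE orders RENAME order_ts TO event_ts");

   // Hypothetical nested rename, only valid if the grammar accepted a compound
   // identifier, mirroring what ALTER TABLE ADD/MODIFY supports:
   // tableEnv.executeSql("ALTER TABLE orders RENAME shipping_info.addr TO shipping_info.address");
   ```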



##########
flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/operations/SqlToOperationConverterTest.java:
##########
@@ -1258,6 +1259,119 @@ public void testAlterTable() throws Exception {
                 .hasMessageContaining("ALTER TABLE RESET does not support 
empty key");
     }
 
+    @Test
+    public void testAlterTableRenameColumn() throws Exception {

Review Comment:
   Can we reuse `prepareTable`?
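   For illustration, a rough shape of such a test, assuming the existing `prepareTable` fixture and `parse` helper of this test class (their exact signatures, and the table/column names below, are placeholders):
   ```java
   @Test
   public void testAlterTableRenameColumn() throws Exception {
       // Reuse the shared table fixture instead of registering a bespoke table;
       // the real helper's signature may differ.
       prepareTable(false);

       Operation operation = parse("ALTER TABLE tb1 RENAME a TO a1", SqlDialect.DEFAULT);

       // The converter should yield a schema-changing operation whose new
       // schema contains the renamed column.
       assertThat(operation).isInstanceOf(AlterTableSchemaOperation.class);
   }
   ```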


