fsk119 commented on a change in pull request #15317:
URL: https://github.com/apache/flink/pull/15317#discussion_r672768162



##########
File path: flink-table/flink-sql-parser/src/main/resources/org.apache.flink.sql.parser.utils/ParserResource.properties
##########
@@ -19,3 +19,4 @@
 MultipleWatermarksUnsupported=Multiple WATERMARK statements is not supported yet.
 OverwriteIsOnlyUsedWithInsert=OVERWRITE expression is only used with INSERT statement.
 createSystemFunctionOnlySupportTemporary=CREATE SYSTEM FUNCTION is not supported, system functions can only be registered as temporary function, you can use CREATE TEMPORARY SYSTEM FUNCTION instead.
+explainDetailIsDuplicate=EXPLAINDETAIL are duplicate.

Review comment:
       The key uses `is` but the value uses `are`.
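       For consistency, one possible wording (purely illustrative, feel free to pick something else) is `explainDetailIsDuplicate=EXPLAIN DETAIL is duplicated.`, or rename the key so that it matches the message.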

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/operations/SqlToOperationConverter.java
##########
@@ -946,16 +947,23 @@ private Operation convertShowViews(SqlShowViews sqlShowViews) {
     private Operation convertRichExplain(SqlRichExplain sqlExplain) {
         Operation operation;
         SqlNode sqlNode = sqlExplain.getStatement();
+        Set<String> explainDetails = sqlExplain.getExplainDetails();
+
+        //  Link to FLINK-22155,EXPLAIN statement should validate insert and query.If sql is a

Review comment:
       Please add a space after punctuation marks. Actually, I think we can remove this comment?

##########
File path: flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/api/TableEnvironmentTest.scala
##########
@@ -877,6 +934,9 @@ class TableEnvironmentTest {
     tableEnv.executeSql(viewDDL)
   }
 
+  @Rule
+  def thrown: ExpectedException = expectedException

Review comment:
       Move this back to its original place.

##########
File path: flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/api/TableEnvironmentTest.scala
##########
@@ -33,25 +31,25 @@ import org.apache.flink.table.module.ModuleEntry
 import org.apache.flink.table.planner.factories.utils.TestCollectionTableFactory._
 import org.apache.flink.table.planner.runtime.stream.sql.FunctionITCase.TestUDF
 import org.apache.flink.table.planner.runtime.stream.table.FunctionITCase.SimpleScalarFunction
-import org.apache.flink.table.planner.utils.TableTestUtil.replaceStageId
+import org.apache.flink.table.planner.utils.TableTestUtil.{replaceStageId, replaceStreamNodeId}
 import org.apache.flink.table.planner.utils.{TableTestUtil, TestTableSourceSinks}
 import org.apache.flink.table.types.DataType
 import org.apache.flink.types.Row
+
+import org.apache.calcite.plan.RelOptUtil
+import org.apache.calcite.sql.SqlExplainLevel
 import org.junit.Assert._
 import org.junit.rules.ExpectedException
 import org.junit.{Rule, Test}
 
 import _root_.java.util
+

Review comment:
       remove this line

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/operations/SqlToOperationConverter.java
##########
@@ -946,16 +947,23 @@ private Operation convertShowViews(SqlShowViews sqlShowViews) {
     private Operation convertRichExplain(SqlRichExplain sqlExplain) {
         Operation operation;
         SqlNode sqlNode = sqlExplain.getStatement();
+        Set<String> explainDetails = sqlExplain.getExplainDetails();
+
+        //  Link to FLINK-22155,EXPLAIN statement should validate insert and query.If sql is a
+        //  INSERT statement, it will parse to RichSqlInsert. If sql is a SELECT statement
+        //  it will be converted to SqlSelect, but when this SELECT statement contains UNION
+        //  it will be converted to SqlBasicCall and it's operator is union instead of converted
+        //  to SqlSelect SqlNode.
         if (sqlNode instanceof RichSqlInsert) {
             operation = convertSqlInsert((RichSqlInsert) sqlNode);
-        } else if (sqlNode instanceof SqlSelect) {
+        } else if (sqlNode instanceof SqlSelect || sqlNode instanceof SqlBasicCall) {

Review comment:
       Replace the condition with `sqlNode.getKind().belongsTo(SqlKind.QUERY)`, as is done at line 300.
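       A minimal standalone sketch (illustrative only, using Calcite's default `SqlParser` rather than Flink's parser factory) of why the single check covers both the `SqlSelect` and the `SqlBasicCall` cases above:

```java
import org.apache.calcite.sql.SqlKind;
import org.apache.calcite.sql.SqlNode;
import org.apache.calcite.sql.parser.SqlParser;

public class QueryKindCheck {
    public static void main(String[] args) throws Exception {
        String[] statements = {
            "SELECT a FROM t1",                           // parses to SqlSelect
            "SELECT a FROM t1 UNION ALL SELECT a FROM t2" // parses to a call whose kind is UNION
        };
        for (String sql : statements) {
            SqlNode node = SqlParser.create(sql).parseQuery();
            // SELECT and UNION (like INTERSECT, EXCEPT, VALUES, ...) all belong to the
            // SqlKind.QUERY category, so one belongsTo(SqlKind.QUERY) check can replace
            // `instanceof SqlSelect || instanceof SqlBasicCall`.
            System.out.println(
                    node.getKind() + " belongsTo QUERY: "
                            + node.getKind().belongsTo(SqlKind.QUERY));
        }
    }
}
```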



