yihua commented on code in PR #11871:
URL: https://github.com/apache/hudi/pull/11871#discussion_r1766152943


##########
hudi-spark-datasource/hudi-spark/src/main/antlr4/org/apache/hudi/spark/sql/parser/HoodieSqlCommon.g4:
##########
@@ -48,6 +48,7 @@
  statement
     : compactionStatement                                                      #compactionCommand
     | CALL multipartIdentifier   callArgumentList?    #call
+    | (DESC | DESCRIBE) TABLE? (FORMATTED | EXTENDED) tableIdentifier          #describeTableCommand

Review Comment:
   High-level question: do we have to add our own parser to intercept `{ DESC | DESCRIBE } TABLE`, given that Spark SQL already supports such statements? See https://spark.apache.org/docs/latest/sql-ref-syntax-aux-describe-table.html.
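
   For reference, a hypothetical sketch of the statements Spark's built-in parser already accepts (`my_table` and the `spark` session here are illustrative assumptions, not from this PR):

   ```scala
   // Spark's native parser handles these without a custom grammar rule
   // (per the DESCRIBE TABLE docs linked above):
   spark.sql("DESCRIBE TABLE EXTENDED my_table").show()
   spark.sql("DESC FORMATTED my_table").show()
   ```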



##########
hudi-spark-datasource/hudi-spark3-common/src/main/scala/org/apache/spark/sql/hudi/analysis/HoodieSpark3Analysis.scala:
##########
@@ -225,6 +224,15 @@ case class HoodieSpark3ResolveReferences(spark: SparkSession) extends Rule[Logic
         case (u: UnresolvedFieldName, prop) => resolveFieldNames(cmd.table, u.name, u) -> prop
         case other => other
       })
+
+    // Convert to DescribeTableCommand
+    case dt @ DescribeTable(plan @ ResolvesToTable(table), formatted, extended, output) if dt.resolved =>

Review Comment:
   Another approach is to add a post-analysis rule that converts `DescribeTableCommand` to `DescribeHoodieTableCommand` in `HoodiePostAnalysisRule` (in `HoodieAnalysis.scala`), similar to how `CreateDataSourceTableCommand` is converted to `CreateHoodieTableCommand`. Would that be feasible and easier?
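
   A rough sketch of what that conversion might look like (hypothetical, not the PR's actual code; the `sparkAdapter.isHoodieTable` check and the rule's shape are assumptions modeled on the existing `CreateHoodieTableCommand` conversion):

   ```scala
   case class HoodiePostAnalysisRule(sparkSession: SparkSession) extends Rule[LogicalPlan] {
     override def apply(plan: LogicalPlan): LogicalPlan = plan match {
       // Rewrite Spark's built-in command into the Hudi-specific one,
       // mirroring the CreateDataSourceTableCommand -> CreateHoodieTableCommand path
       case DescribeTableCommand(table, partitionSpec, isExtended, output)
           if sparkAdapter.isHoodieTable(table, sparkSession) =>  // assumed helper
         DescribeHoodieTableCommand(table, partitionSpec, isExtended, output)
       case _ => plan
     }
   }
   ```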



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
