mihailom-db commented on code in PR #47364:
URL: https://github.com/apache/spark/pull/47364#discussion_r1746565878


##########
sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala:
##########
@@ -1112,4 +1112,15 @@ class SparkSqlAstBuilder extends AstBuilder {
       withIdentClause(ctx.identifierReference(), UnresolvedNamespace(_)),
       cleanedProperties)
   }
+
+  /**
+   * Create a [[ShowCollationsCommand]] command.
+   * Expected format:
+   * {{{
+   *   SHOW COLLATIONS (LIKE? pattern)?;
+   * }}}
+   */
+  override def visitShowCollations(ctx: ShowCollationsContext): LogicalPlan = withOrigin(ctx) {

Review Comment:
   We can do one of two things here: follow the functions implementation completely but throw a new error saying that user-defined collations are not supported, or keep `SHOW COLLATIONS (LIKE? pattern=stringLit)?` and make sure we implement the listing of collations in a similar way to functions. The second approach might be clearer for now, as we would simply throw a parse exception if someone tried to use syntax specific to user-defined collations. The `((FROM | IN) ns=identifierReference)?` part of the grammar is a good add-on, but for now we are not going to use it; in `showFunctions` we only use `identifier?` and `((FROM | IN) ns=identifierReference)?` when listing UDFs and temporary functions from specific catalogs/schemas.
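
   As a sketch of the second approach: "listing of collations in a similar way as functions" would mean applying the same wildcard filtering that `SHOW FUNCTIONS` uses (via Spark's `StringUtils.filterPattern`), where `*` matches any character sequence, `|` separates alternatives, and matching is case-insensitive. The object and collation names below are hypothetical illustrations, not code from the PR:

   ```scala
   import java.util.regex.PatternSyntaxException
   import scala.collection.mutable

   // Hypothetical standalone sketch mirroring StringUtils.filterPattern semantics.
   object CollationPatternFilter {
     def filterPattern(names: Seq[String], pattern: String): Seq[String] = {
       // Collect matches in sorted order, deduplicated across sub-patterns.
       val matched = mutable.SortedSet.empty[String]
       pattern.trim().split("\\|").foreach { subPattern =>
         try {
           // '*' becomes '.*'; '(?i)' makes the match case-insensitive.
           val regex = ("(?i)" + subPattern.replaceAll("\\*", ".*")).r
           matched ++= names.filter(n => regex.pattern.matcher(n).matches())
         } catch {
           case _: PatternSyntaxException => // skip malformed sub-patterns
         }
       }
       matched.toSeq
     }
   }
   ```

   For example, `CollationPatternFilter.filterPattern(Seq("UTF8_BINARY", "UNICODE", "UNICODE_CI"), "unicode*")` returns `Seq("UNICODE", "UNICODE_CI")`.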



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]