tangrizzly commented on code in PR #53941:
URL: https://github.com/apache/spark/pull/53941#discussion_r2723196457
##########
sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala:
##########
@@ -410,15 +412,52 @@ class SparkSqlAstBuilder extends AstBuilder {
/**
* Create a [[SetCatalogCommand]] logical command.
+ *
+ * SET CATALOG is case-sensitive and supports multiple forms:
+ *
+ * - Simple identifier: SET CATALOG my_catalog
+ * - String literal: SET CATALOG 'my_catalog'
+ * - identifier() function: SET CATALOG identifier('my_catalog')
+ * - CAST, CONCAT, or other foldable expressions: SET CATALOG CAST('my_catalog' AS STRING)
+ *
+ * Note: SET CATALOG <session_temp_variable> is not supported and users should use SET CATALOG
+ * IDENTIFIER(<session_temp_variable>) instead.
*/
override def visitSetCatalog(ctx: SetCatalogContext): LogicalPlan =
withOrigin(ctx) {
- withCatalogIdentClause(ctx.catalogIdentifierReference, identifiers => {
- if (identifiers.size > 1) {
- // can occur when user put multipart string in IDENTIFIER(...) clause
- throw QueryParsingErrors.invalidNameForSetCatalog(identifiers, ctx)
+ val expr = expression(ctx.expression())
+
+ def buildSetCatalogCommand(nameParts: Seq[String]): SetCatalogCommand = {
+ if (nameParts.size > 1) {
+ throw QueryParsingErrors.invalidNameForSetCatalog(nameParts, ctx)
}
- SetCatalogCommand(identifiers.head)
- })
+ SetCatalogCommand(nameParts.head)
+ }
+
+ expr match {
+ // Directly resolve string literals (e.g., 'testcat' or "testcat")
+ case Literal(catalogName: UTF8String, StringType) =>
+ SetCatalogCommand(catalogName.toString)
+
+ // Directly resolve simple identifiers (e.g., my_catalog)
+ case UnresolvedAttribute(nameParts) =>
+ buildSetCatalogCommand(nameParts)
+
+ // Special handling for identifier() function - extract inner expression
+ case ExpressionWithUnresolvedIdentifier(identifierExpr, _, _) =>
+ PlanWithUnresolvedIdentifier(
+ identifierExpr,
+ Nil,
+ (identifiers, _) => buildSetCatalogCommand(identifiers)
+ )
+
+ // For other foldable expressions (CAST, CONCAT, etc.), resolve in analysis phase
Review Comment:
Good question. I tested it out: those foldable expressions are still `UnresolvedAttribute` at this stage, e.g. `UnresolvedAttribute("cast")`, and the analyzer needs to resolve them into a real `Cast(...)` expression before we can call `.foldable` and `.eval()`. I'm following how the expression is handled [here](https://github.com/tangrizzly/spark/blob/8e1e126c68aaa94f0fa6861b93042e1385615418/sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala#L186) to push the expression down to the analyzer, which checks foldability and evaluates it.
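For readers following along, here is a minimal, Spark-free sketch of the parse-time vs. analysis-time split described above. All names (`UnresolvedAttr`, `Lit`, `analyze`, the hard-coded resolution) are simplified, hypothetical stand-ins for Spark's real classes, not Spark's actual API:

```scala
// Minimal sketch, plain Scala with no Spark dependencies.
// At parse time `CAST('my_catalog' AS STRING)` shows up as an
// unresolved name, which cannot be folded; only after the
// "analyzer" rewrites it into a concrete expression can we check
// foldability and evaluate it.
object SetCatalogSketch {
  sealed trait Expr {
    def foldable: Boolean
    def eval(): String
  }

  // What the parser produces: just an unresolved name.
  final case class UnresolvedAttr(name: String) extends Expr {
    def foldable: Boolean = false
    def eval(): String =
      throw new IllegalStateException(s"cannot eval unresolved: $name")
  }

  final case class Lit(value: String) extends Expr {
    def foldable: Boolean = true
    def eval(): String = value
  }

  final case class Cast(child: Expr) extends Expr {
    def foldable: Boolean = child.foldable
    def eval(): String = child.eval()
  }

  // Stand-in for the analyzer: resolves the unresolved name into a
  // real expression (hard-coded here purely for illustration).
  def analyze(e: Expr): Expr = e match {
    case UnresolvedAttr("cast") => Cast(Lit("my_catalog"))
    case other                  => other
  }

  // Resolve the catalog name only once analysis has made the
  // expression foldable.
  def resolveCatalogName(parsed: Expr): String = {
    val analyzed = analyze(parsed)
    require(analyzed.foldable, "SET CATALOG expression must be foldable")
    analyzed.eval()
  }

  def main(args: Array[String]): Unit = {
    val parsed = UnresolvedAttr("cast")
    println(parsed.foldable)            // false at parse time
    println(resolveCatalogName(parsed)) // my_catalog after analysis
  }
}
```

This mirrors the shape of the fix: the parser defers evaluation rather than calling `.eval()` on an expression the analyzer has not resolved yet.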
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]