cloud-fan commented on a change in pull request #26667: [SPARK-29922][SQL] SHOW FUNCTIONS should do multi-catalog resolution
URL: https://github.com/apache/spark/pull/26667#discussion_r351157715
##########
File path:
sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala
##########
@@ -455,6 +455,25 @@ class ResolveSessionCatalog(
ShowTablePropertiesCommand(
tableName.asTableIdentifier,
propertyKey)
+
+ case ShowFunctionsStatement(scope, pattern, functionName) =>
+ import ShowFunctionsStatement._
+ val userScope = scope.map(s => s == ALL || s == USER).getOrElse(true)
+ val systemScope = scope.map(s => s == ALL || s == SYSTEM).getOrElse(true)
+ val (db, function) = functionName match {
Review comment:
Currently the lookup here doesn't respect multi-catalog resolution, e.g. what
happens for `SHOW FUNCTIONS myCata.ns1`?
You have a point that we should fail nicely for v2 catalogs. How about:
```
case ShowFunctionsStatement(userScope, systemScope, pattern,
CatalogAndIdentifierParts(catalog, functionName)) =>
if (isSessionCatalog(catalog)) ... else fail...
```
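To make the intent concrete, here is a minimal standalone sketch (not Spark code) of the two pieces being discussed: the scope-flag derivation from the diff, and the "peel off a leading catalog name" resolution the suggestion relies on. `ShowFunctionsSketch`, `resolveScope`, and `splitCatalog` are hypothetical names invented for illustration, not Spark APIs such as `CatalogAndIdentifierParts`:

```scala
// Hypothetical standalone model of the logic in the diff and the suggestion.
object ShowFunctionsSketch {
  sealed trait Scope
  case object ALL extends Scope
  case object USER extends Scope
  case object SYSTEM extends Scope

  // Mirrors the diff: an absent scope defaults to showing both
  // user-defined and system (built-in) functions.
  def resolveScope(scope: Option[Scope]): (Boolean, Boolean) = {
    val userScope = scope.map(s => s == ALL || s == USER).getOrElse(true)
    val systemScope = scope.map(s => s == ALL || s == SYSTEM).getOrElse(true)
    (userScope, systemScope)
  }

  // Mirrors the suggested resolution: if the first name part is a known
  // catalog, treat the rest as the identifier; otherwise fall back to the
  // session catalog. A resolver could then fail for non-session catalogs.
  def splitCatalog(
      nameParts: Seq[String],
      knownCatalogs: Set[String],
      sessionCatalog: String): (String, Seq[String]) = nameParts match {
    case head +: rest if knownCatalogs.contains(head) => (head, rest)
    case parts => (sessionCatalog, parts)
  }
}
```

With this split, `SHOW FUNCTIONS myCata.ns1` would resolve to catalog `myCata` with identifier parts `ns1`, and a caller can raise a clear error when the catalog is not the session catalog instead of silently misinterpreting the name.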
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services