planga82 commented on a change in pull request #26667: [SPARK-29922][SQL] SHOW FUNCTIONS should do multi-catalog resolution
URL: https://github.com/apache/spark/pull/26667#discussion_r351153354
 
 

 ##########
 File path: 
sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala
 ##########
 @@ -455,6 +455,25 @@ class ResolveSessionCatalog(
       ShowTablePropertiesCommand(
         tableName.asTableIdentifier,
         propertyKey)
+
+    case ShowFunctionsStatement(scope, pattern, functionName) =>
+      import ShowFunctionsStatement._
+      val userScope = scope.map(s => s == ALL || s == USER).getOrElse(true)
+      val systemScope = scope.map(s => s == ALL || s == SYSTEM).getOrElse(true)
+      val (db, function) = functionName match {
 
 Review comment:
   I have doubts about this in the case where the catalog is not a session catalog. We have to resolve the catalog, but if it turns out not to be a session catalog, I think we should show a clear error message. In other commands we do something like this to raise the exception:

   ```scala
   if (!isSessionCatalog(catalog)) {
     throw new AnalysisException(s"$sql is only supported with v1 tables.")
   }
   ```

   If instead we write something like

   ```scala
   case ShowFunctionsStatement(..., SessionCatalog(_, functionName))
   ```

   then, for a non-session catalog, no case matches and the statement surfaces as an "unresolved" exception, and that error is not very clear.

   This is the reason why I did it that way, but it's possible I'm missing something.
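
   To make the trade-off concrete, here is a minimal self-contained sketch (stub types only, not Spark's actual classes; `resolveShowFunctions`, `V2Catalog`, and the stub `AnalysisException` are hypothetical stand-ins) of why the explicit check yields the clearer error:

   ```scala
   // Stub for Spark's AnalysisException, just for this sketch.
   final case class AnalysisException(message: String) extends Exception(message)

   sealed trait Catalog
   case object SessionCatalog extends Catalog          // stand-in for the v1 session catalog
   final case class V2Catalog(name: String) extends Catalog

   // Hypothetical resolver step: an explicit check lets us raise a
   // descriptive AnalysisException instead of letting the statement
   // fall through the match and fail later as "unresolved".
   def resolveShowFunctions(catalog: Catalog, sql: String): String = {
     if (catalog != SessionCatalog) {
       throw AnalysisException(s"$sql is only supported with the session catalog.")
     }
     "ShowFunctionsCommand"  // here we would build the actual v1 command
   }
   ```

   With an extractor-style match (`case ShowFunctionsStatement(..., SessionCatalog(...))`), a non-session catalog simply matches nothing, so the user sees a generic unresolved-plan error with no hint about the catalog being the problem.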

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
