szehon-ho commented on code in PR #50109:
URL: https://github.com/apache/spark/pull/50109#discussion_r2008483160
##########
sql/core/src/test/scala/org/apache/spark/sql/connector/ProcedureSuite.scala:
##########
@@ -40,15 +40,23 @@ class ProcedureSuite extends QueryTest with SharedSparkSession with BeforeAndAft
   before {
     spark.conf.set(s"spark.sql.catalog.cat", classOf[InMemoryCatalog].getName)
+    spark.conf.set(s"spark.sql.catalog.cat2", classOf[InMemoryCatalog].getName)
+
+    // needed for switching back and forth between catalogs
+    sql("create database cat.default")

Review Comment:
   Yea, found it: this is because of https://github.com/apache/spark/blob/master/sql/core/src/test/scala/org/apache/spark/sql/test/SQLTestUtils.scala#L383, where the test fixture does 'use default' at the end. Changed to use 'withNamespace' (see the first sketch at the end of this message).

##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/CatalogPlugin.java:
##########
@@ -74,4 +74,12 @@ public interface CatalogPlugin {
   default String[] defaultNamespace() {
     return new String[0];
   }
+
+  /**
+   *
+   * @return true if catalog supports multi-part namespaces
+   */
+  default boolean supportsMultiPartNamespace() {
+    return false;

Review Comment:
   Makes sense, changed it so 'namespace' is not populated unless there is actually a multi-part namespace. There's not a lot of value in a non-multi-part-namespace procedure carrying an empty array just because the catalog supports it (see the second sketch at the end of this message).

##########
docs/sql-ref-ansi-compliance.md:
##########
@@ -648,6 +648,7 @@ Below is a list of all the keywords in Spark SQL.
 |PRECEDING|non-reserved|non-reserved|non-reserved|
 |PRIMARY|reserved|non-reserved|reserved|
 |PRINCIPALS|non-reserved|non-reserved|non-reserved|
+|PROCEDURES|reserved|non-reserved|non-reserved|

Review Comment:
   Changed to non-reserved.
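
For the ProcedureSuite comment above: a minimal sketch of how a test could lean on 'withNamespace' instead of pre-creating 'cat.default'. It assumes the SQLTestUtils helper withNamespace(names: String*)(f: => Unit), which drops each listed namespace (CASCADE) in a finally block; the test name and namespace names are made up for illustration.

    test("procedure across catalogs (illustrative only)") {
      // cat and cat2 are both registered as InMemoryCatalog in `before`
      withNamespace("cat.ns", "cat2.ns") {
        sql("CREATE NAMESPACE cat.ns")
        sql("CREATE NAMESPACE cat2.ns")
        sql("USE cat.ns")
        // ... exercise a procedure registered in cat, then switch catalogs ...
        sql("USE cat2.ns")
        // ... exercise a procedure registered in cat2 ...
      }
      // withNamespace drops cat.ns and cat2.ns in a finally block, so nothing is
      // left behind for the next test even if an assertion above fails.
    }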
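
For the CatalogPlugin comment above: a rough, hypothetical illustration (not the PR's code) of only emitting a namespace value when the procedure identifier actually carries namespace parts, rather than an empty array; the helper name and the Option-based shape are assumptions made for this sketch.

    import org.apache.spark.sql.connector.catalog.Identifier

    // Hypothetical helper: report a namespace only when the identifier has one.
    def namespaceField(ident: Identifier): Option[Seq[String]] = {
      if (ident.namespace().nonEmpty) Some(ident.namespace().toSeq) else None
    }

    // namespaceField(Identifier.of(Array("ns1", "ns2"), "proc")) => Some(Seq("ns1", "ns2"))
    // namespaceField(Identifier.of(Array.empty[String], "proc")) => None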