cloud-fan commented on a change in pull request #25247: [SPARK-28319][SQL] Implement SHOW TABLES for Data Source V2 Tables
URL: https://github.com/apache/spark/pull/25247#discussion_r312880616
 
 

 ##########
 File path: sql/core/src/test/scala/org/apache/spark/sql/sources/v2/DataSourceV2SQLSuite.scala
 ##########
 @@ -1700,6 +1704,126 @@ class DataSourceV2SQLSuite extends QueryTest with SharedSQLContext with BeforeAn
     }
   }
 
+  test("ShowTables: using v2 catalog") {
+    spark.sql("CREATE TABLE testcat.db.table_name (id bigint, data string) 
USING foo")
+    spark.sql("CREATE TABLE testcat.n1.n2.db.table_name (id bigint, data 
string) USING foo")
+
+    runShowTablesSql("SHOW TABLES FROM testcat.db", Seq(Row("db", 
"table_name")))
+
+    runShowTablesSql(
+      "SHOW TABLES FROM testcat.n1.n2.db",
+      Seq(Row("n1.n2.db", "table_name")))
+  }
+
+  test("ShowTables: using v2 catalog with a pattern") {
+    spark.sql("CREATE TABLE testcat.db.table (id bigint, data string) USING 
foo")
+    spark.sql("CREATE TABLE testcat.db.table_name_1 (id bigint, data string) 
USING foo")
+    spark.sql("CREATE TABLE testcat.db.table_name_2 (id bigint, data string) 
USING foo")
+    spark.sql("CREATE TABLE testcat.db2.table_name_2 (id bigint, data string) 
USING foo")
+
+    runShowTablesSql(
+      "SHOW TABLES FROM testcat.db",
+      Seq(
+        Row("db", "table"),
+        Row("db", "table_name_1"),
+        Row("db", "table_name_2")))
+
+    runShowTablesSql(
+      "SHOW TABLES FROM testcat.db LIKE '*name*'",
+      Seq(Row("db", "table_name_1"), Row("db", "table_name_2")))
+
+    runShowTablesSql(
+      "SHOW TABLES FROM testcat.db LIKE '*2'",
+      Seq(Row("db", "table_name_2")))
+  }
+
+  test("ShowTables: using v2 catalog, namespace doesn't exist") {
+    runShowTablesSql("SHOW TABLES FROM testcat.unknown", Seq())
 
 Review comment:
   v2 can support `CREATE NAMESPACE`; we have the APIs in `SupportsNamespaces`.
   
   I think this is another case we should discuss: how strictly should Spark define the semantics of a SQL command? e.g. for `SHOW TABLES catalog.nonExistingNS`, should Spark guarantee that the command fails if the namespace doesn't exist?
   
   cc @brkyvz @rdblue 
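
   A minimal sketch of what the stricter semantics could look like, assuming the catalog also implements `SupportsNamespaces` (interface and package names follow the connector catalog API and may not match this PR's branch exactly); `listTablesStrict` is a hypothetical helper for illustration, not code from the PR:

   ```scala
   // Hypothetical helper (not the PR's implementation): fail SHOW TABLES fast when
   // the target namespace does not exist, instead of returning an empty result.
   import org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
   import org.apache.spark.sql.connector.catalog.{SupportsNamespaces, TableCatalog}

   def listTablesStrict(catalog: TableCatalog, namespace: Array[String]): Seq[String] = {
     catalog match {
       case ns: SupportsNamespaces if !ns.namespaceExists(namespace) =>
         // namespaceExists lets us validate the namespace up front.
         throw new NoSuchNamespaceException(namespace)
       case _ =>
         // A catalog without namespace support can only fall through and list directly.
     }
     catalog.listTables(namespace).map(_.name).toSeq
   }
   ```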
