cloud-fan commented on a change in pull request #27642: [SPARK-30885][SQL] V1 table name should be fully qualified if catalog name is provided
URL: https://github.com/apache/spark/pull/27642#discussion_r383281551
 
 

 ##########
 File path: sql/core/src/test/scala/org/apache/spark/sql/connector/DataSourceV2SQLSessionCatalogSuite.scala
 ##########
 @@ -19,13 +19,14 @@ package org.apache.spark.sql.connector
 
 import org.apache.spark.sql.{DataFrame, SaveMode}
 import org.apache.spark.sql.connector.catalog.{Identifier, Table, TableCatalog}
 +import org.apache.spark.sql.connector.catalog.CatalogManager.SESSION_CATALOG_NAME
 
 class DataSourceV2SQLSessionCatalogSuite
   extends InsertIntoTests(supportsDynamicOverwrite = true, includeSQLOnlyTests = true)
   with AlterTableTests
   with SessionCatalogTest[InMemoryTable, InMemoryTableSessionCatalog] {
 
-  override protected val catalogAndNamespace = ""
+  override protected val catalogAndNamespace = "default."
 
 Review comment:
   Thinking about this more: can we avoid changing this by updating the test cases that check the error message instead?
   
   There is already similar code for this, e.g. in `InsertIntoTests`:
   ```
   val tableName = if (catalogAndNamespace.isEmpty) s"default.$t1" else t1
   assert(exc.getMessage.contains(s"Cannot write to '$tableName', too many data columns"))
   ```
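   To illustrate the suggestion: instead of changing `catalogAndNamespace`, each affected test could compute the expected fully qualified name conditionally, as the quoted `InsertIntoTests` snippet does. A minimal, self-contained sketch of that pattern (the object and method names here are hypothetical, not part of the Spark test suites):
   ```scala
   object QualifiedNameCheck {
     // Mirrors the pattern quoted above: when no catalog/namespace prefix is
     // configured, the session catalog's "default" database is prepended to the
     // expected name; otherwise the table name is assumed to already be qualified.
     def expectedName(catalogAndNamespace: String, t1: String): String =
       if (catalogAndNamespace.isEmpty) s"default.$t1" else t1

     def main(args: Array[String]): Unit = {
       // No prefix configured: the error message qualifies with "default.".
       println(expectedName("", "tbl"))
       // Prefix configured: t1 already carries the qualification.
       println(expectedName("testcat.ns.", "testcat.ns.tbl"))
     }
   }
   ```
   With this shape, the error-message assertions pass in both configurations and the suite-level `catalogAndNamespace` override becomes unnecessary.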

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
