cloud-fan commented on a change in pull request #27718: [SPARK-30885][SQL][FOLLOW-UP] Fix issues where some V1 commands allow tables that are not fully qualified.
URL: https://github.com/apache/spark/pull/27718#discussion_r386215812
 
 

 ##########
 File path: sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala
 ##########
 @@ -663,14 +666,13 @@ class ResolveSessionCatalog(
   object TempViewOrV1Table {
     def unapply(nameParts: Seq[String]): Option[Seq[String]] = nameParts match {
       case _ if isTempView(nameParts) => Some(nameParts)
-      case SessionCatalogAndTable(_, tbl) =>
-        if (nameParts.head == CatalogManager.SESSION_CATALOG_NAME && tbl.length == 1) {
-          // For name parts like `spark_catalog.t`, we need to fill in the default database so
-          // that the caller side won't treat it as a temp view.
-          Some(Seq(catalogManager.v1SessionCatalog.getCurrentDatabase, tbl.head))
-        } else {
-          Some(tbl)
+      case Seq(CatalogManager.SESSION_CATALOG_NAME, name @ _*) if isTempView(name) =>
+        throw new AnalysisException(s"Table or view not found: ${nameParts.quoted}")
 
 Review comment:
   For Spark 2.4, yes, we will fail with `Database 'spark_catalog' not found`. But in 3.0 we have a new resolution rule, and it's more reasonable to fail with something like `invalid namespace for session catalog` when we see `spark_catalog.t`.
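
The behaviour the new `case` arm gives the extractor can be sketched as a standalone snippet. This is a hypothetical, simplified mock (the `tempViews` set, the `resolve` helper, and the use of `Either` in place of throwing `AnalysisException` are illustration only, not actual Spark code):

```scala
// Minimal sketch of the TempViewOrV1Table resolution order discussed above:
// a bare temp-view name resolves, but qualifying a temp view with the
// session catalog name is rejected instead of silently resolving.
object NameResolutionSketch {
  val SessionCatalogName = "spark_catalog"

  // Hypothetical stand-in for the session's registered temp views.
  val tempViews: Set[Seq[String]] = Set(Seq("t"))

  def isTempView(nameParts: Seq[String]): Boolean = tempViews.contains(nameParts)

  // Mirrors the shape of the extractor in the diff; returns Left(error)
  // where the real code throws AnalysisException.
  def resolve(nameParts: Seq[String]): Either[String, Seq[String]] =
    nameParts match {
      case _ if isTempView(nameParts) =>
        Right(nameParts)
      case Seq(SessionCatalogName, rest @ _*) if isTempView(rest) =>
        Left(s"Table or view not found: ${nameParts.mkString(".")}")
      case _ =>
        Right(nameParts)
    }
}
```

Under this sketch, `resolve(Seq("t"))` succeeds as a temp-view lookup, while `resolve(Seq("spark_catalog", "t"))` fails, matching the "table or view not found" behaviour in the diff.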

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
