imback82 commented on a change in pull request #27776: [SPARK-31024][SQL] Allow 
specifying session catalog name `spark_catalog` in qualified column names for 
v1 tables
URL: https://github.com/apache/spark/pull/27776#discussion_r387234149
 
 

 ##########
 File path: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
 ##########
 @@ -769,9 +771,9 @@ class SessionCatalog(
         desc = metadata,
         output = metadata.schema.toAttributes,
         child = parser.parsePlan(viewText))
-      SubqueryAlias(table, db, child)
+      SubqueryAlias(multiParts, child)
     } else {
-      SubqueryAlias(table, db, UnresolvedCatalogRelation(metadata))
+      SubqueryAlias(multiParts, UnresolvedCatalogRelation(metadata))
 
 Review comment:
   @cloud-fan Now that the session catalog name is inserted as a qualifier, 
`matchWithTwoOrLessQualifierParts` will no longer be used for resolving columns of a 
table. If we need this in Spark 3.0, I could try to update that function to 
handle a 3-part qualifier, `catalog_name.db.table`.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.