singhpk234 commented on code in PR #13979:
URL: https://github.com/apache/iceberg/pull/13979#discussion_r2934281004
##########
spark/v4.1/spark-extensions/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveViews.scala:
##########
@@ -102,12 +150,21 @@ case class ResolveViews(spark: SparkSession) extends Rule[LogicalPlan] with Look
     }
   }
-  private def createViewRelation(nameParts: Seq[String], view: View): LogicalPlan = {
+  private def createViewRelation(
+      nameParts: Seq[String],
+      view: View,
+      existingChain: Option[Seq[Seq[String]]] = None): LogicalPlan = {
     val parsed = parseViewText(nameParts.quoted, view.query)
     // Apply any necessary rewrites to preserve correct resolution
     val viewCatalogAndNamespace: Seq[String] = view.currentCatalog +: view.currentNamespace.toSeq
-    val rewritten = rewriteIdentifiers(parsed, viewCatalogAndNamespace);
+    // Build the view chain: prepend existing chain (from outer views) with the current view
+    val currentViewParts = viewCatalogAndNamespace ++ Seq(nameParts.last)
Review Comment:
That's a good point. I think we should require these resources/entities to be in one catalog, since the REST catalog is blindsided in these scenarios. In addition, catalog names can also be aliases, e.g. both `test` and `prod` could point to the same catalog URI, just through different REST clients.
I would propose only including entries from a single catalog, adding checks on the catalog for which we are accumulating things, and, when resolving a single-part identifier, making sure that if we include it, its catalog is the same as the session catalog. Let me add checks for these.
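To sketch the proposed checks (a minimal illustration only, assuming hypothetical names like `canAccumulate`, `chainCatalog`, and `sessionCatalog` — not the actual Iceberg/Spark APIs): entries are only accumulated while they stay in one catalog, and a single-part identifier is only accepted when that catalog is the session catalog.

```scala
// Hypothetical sketch of the check described above; identifiers are
// represented as their name parts, e.g. Seq("prod", "db", "v1").
object ViewChainCheck {
  type Ident = Seq[String]

  // Returns true when `ident` may be added to a chain that is currently
  // accumulating entries for `chainCatalog`.
  def canAccumulate(ident: Ident, chainCatalog: String, sessionCatalog: String): Boolean =
    ident match {
      // Single-part identifier: only safe when the chain's catalog is the
      // session catalog, since that's where it would resolve.
      case Seq(_)       => chainCatalog == sessionCatalog
      // Multi-part identifier: the leading catalog must match the chain's.
      case catalog +: _ => catalog == chainCatalog
      case _            => false
    }
}
```

Note that this comparison is by catalog *name*, so aliased catalogs (e.g. `test` and `prod` pointing at the same URI) would still be treated as distinct, which matches the conservative behavior proposed above.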
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]