cloud-fan commented on a change in pull request #31273:
URL: https://github.com/apache/spark/pull/31273#discussion_r578363697
##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala
##########
@@ -1101,9 +1116,9 @@ class Analyzer(override val catalogManager: CatalogManager)
       // The view's child should be a logical plan parsed from the `desc.viewText`, the variable
       // `viewText` should be defined, or else we throw an error on the generation of the View
       // operator.
-      case view @ View(desc, isTempView, child) if !child.resolved =>
+      case view @ View(Some(desc), isTempView, child) if !child.resolved =>
         // Resolve all the UnresolvedRelations and Views in the child.
-        val newChild = AnalysisContext.withAnalysisContext(desc) {
+        val newChild = AnalysisContext.withAnalysisContext(desc, isTempView) {
Review comment:
I'm a bit confused here. For the new SQL temp view, we use
`referredTempViewNames` to decide if we need to resolve temp views inside the
view definition. What's wrong with it?
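
The question above turns on `referredTempViewNames` gating whether a name inside a view definition may resolve to a temp view. As a hedged sketch only (the `AnalysisContext` case class and `isResolvingTempView` helper below are simplified stand-ins for illustration, not the actual Spark Analyzer code), the idea is roughly:

```scala
// Sketch: resolve a name to a temp view only if that multi-part name was
// captured in `referredTempViewNames` when the view was created.
object TempViewResolutionSketch {

  // Simplified stand-in for Spark's AnalysisContext: records which temp-view
  // names the view definition referred to at creation time.
  case class AnalysisContext(referredTempViewNames: Seq[Seq[String]])

  // A lookup may resolve to a temp view only if the (case-insensitive)
  // multi-part name appears in the captured list.
  def isResolvingTempView(ctx: AnalysisContext, nameParts: Seq[String]): Boolean =
    ctx.referredTempViewNames.exists { captured =>
      captured.map(_.toLowerCase) == nameParts.map(_.toLowerCase)
    }

  def main(args: Array[String]): Unit = {
    val ctx = AnalysisContext(referredTempViewNames = Seq(Seq("tmp_sales")))
    println(isResolvingTempView(ctx, Seq("TMP_SALES"))) // captured at creation
    println(isResolvingTempView(ctx, Seq("other_view"))) // not captured
  }
}
```

Under this sketch, a temp view created after the view definition would not be in `referredTempViewNames` and so would not be resolved inside the view, which is the behavior the comment is asking about.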
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]