peter-toth commented on a change in pull request #23531: [SPARK-24497][SQL]
Support recursive SQL query
URL: https://github.com/apache/spark/pull/23531#discussion_r252151161
##########
File path:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala
##########
@@ -202,19 +203,58 @@ class Analyzer(
Batch("Subquery", Once,
UpdateOuterReferences),
Batch("Cleanup", fixedPoint,
- CleanupAliases)
+ CleanupAliases),
+ Batch("Recursion", Once,
+ CheckRecursionConstraints)
)
+ object ResolveRecursiveReferneces extends Rule[LogicalPlan] {
Review comment:
Hmm, I don't think we can move it. Resolving a `RecursiveTable` has 2 phases: first the anchor terms are resolved, which lets us resolve the table's `UnresolvedRecursiveReference`s to `RecursiveReference`s (in the rule `ResolveRecursiveReferneces`), and then in the second phase the `RecursiveTable` itself can be resolved by the usual rules.
At the point when `CTESubstitution` runs we can't insert resolved `RecursiveReference`s because we don't yet know the attributes coming from the anchors.
Placing `UnresolvedRecursiveReference`s when `CTESubstitution` runs postpones resolving the recursive terms until we have the information necessary to resolve them properly.
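
Just to illustrate the two phases I mean, here is a minimal, self-contained Scala sketch. The node shapes and the rule body below are hypothetical stand-ins for illustration only, not the actual `RecursiveTable` / `RecursiveReference` catalyst classes or the `ResolveRecursiveReferneces` rule from this PR:

```scala
// Hypothetical, simplified stand-ins -- not the catalyst classes from the PR.
sealed trait Plan { def resolved: Boolean }

case class Attribute(name: String)

// Anchor term: resolvable on its own (e.g. a plain SELECT).
case class Anchor(output: Seq[Attribute]) extends Plan { val resolved = true }

// A self-reference inside the recursive term; its attributes are unknown
// until the anchor has been resolved.
case class UnresolvedRecursiveReference(cteName: String) extends Plan {
  val resolved = false
}

// The same reference once the anchor's output attributes are known.
case class RecursiveReference(cteName: String, output: Seq[Attribute]) extends Plan {
  val resolved = true
}

case class RecursiveTable(name: String, anchor: Plan, recursiveTerm: Plan) extends Plan {
  def resolved: Boolean = anchor.resolved && recursiveTerm.resolved
}

object ResolveRecursiveReferencesSketch {

  // Phase 1: once the anchor is resolved, its output attributes are known,
  // so the UnresolvedRecursiveReferences in the recursive term can be replaced.
  def resolveRecursiveReferences(plan: Plan): Plan = plan match {
    case t @ RecursiveTable(name, anchor: Anchor, recursiveTerm) =>
      t.copy(recursiveTerm = replaceRefs(recursiveTerm, name, anchor.output))
    case other => other
  }

  private def replaceRefs(plan: Plan, name: String, output: Seq[Attribute]): Plan =
    plan match {
      case UnresolvedRecursiveReference(`name`) => RecursiveReference(name, output)
      case other => other // a real rule would recurse into children here
    }

  // Example: WITH RECURSIVE r AS (SELECT 1 AS n UNION ALL SELECT n + 1 FROM r ...)
  val table = RecursiveTable(
    "r",
    anchor = Anchor(Seq(Attribute("n"))),
    recursiveTerm = UnresolvedRecursiveReference("r"))
  // resolveRecursiveReferences(table).resolved == true

  // Phase 2 (not shown): with the references resolved, the RecursiveTable
  // itself is handled by the usual resolution rules.
}
```

The point is only that the rewrite in phase 1 needs the anchor's output attributes, which is exactly what is still missing at the time `CTESubstitution` runs.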