maropu commented on a change in pull request #29585:
URL: https://github.com/apache/spark/pull/29585#discussion_r484230287



##########
File path: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/LogicalPlan.scala
##########
@@ -203,3 +203,58 @@ abstract class BinaryNode extends LogicalPlan {
 abstract class OrderPreservingUnaryNode extends UnaryNode {
   override final def outputOrdering: Seq[SortOrder] = child.outputOrdering
 }
+
+object LogicalPlanIntegrity {
+
+  private def canGetOutputAttrs(p: LogicalPlan): Boolean = {
+    p.resolved && !p.expressions.exists { e =>
+      e.collectFirst {
+        // We cannot call `output` in plans with a `ScalarSubquery` expr having no column,
+        // so we filter them out in advance.
+        case s: ScalarSubquery if s.plan.schema.fields.isEmpty => true
+      }.isDefined
+    }
+  }
+
+  /**
+   * Since some logical plans (e.g., `Union`) can build `AttributeReference`s in their `output`,
+   * this method checks if the same `ExprId` refers to a semantically-equal attribute
+   * in a plan output.
+   */
+  def hasUniqueExprIdsForOutput(plan: LogicalPlan): Boolean = {
+    val allOutputAttrs = plan.collect { case p if canGetOutputAttrs(p) =>
+      p.output.filter(_.resolved).map(_.canonicalized.asInstanceOf[Attribute])
+    }
+    val groupedAttrsByExprId = allOutputAttrs
+      .flatten.groupBy(_.exprId).values.map(_.distinct)
+    groupedAttrsByExprId.forall(_.length == 1)
+  }
+
+  /**
+   * This method checks that reference `ExprId`s are not reused when assigning a new `ExprId`.
+   * For example, it returns false if plan transformers create an alias with the same `ExprId`
+   * as one of its reference attributes, e.g., `a#1 + 1 AS a#1`.
+   */
+  def checkIfSameExprIdNotReused(plan: LogicalPlan): Boolean = {
+    plan.collect { case p if p.resolved =>
+      val inputExprIds = p.inputSet.filter(_.resolved).map(_.exprId).toSet
+      val newExprIds = p.expressions.filter(_.resolved).flatMap { e =>
+        e.collect {
+          // Aliases that just rename foldable expressions are accepted here, e.g.,
+          // `FoldablePropagation` generates this renaming pattern.
+          case a: Alias if !a.child.foldable => a.exprId

Review comment:
    I added this filter to avoid the failure below:
   ```
   // SQLWindowFunctionSuite
   [info] - SPARK-7595: Window will cause resolve failed with self join *** FAILED *** (264 milliseconds)
   [info]   org.apache.spark.sql.catalyst.errors.package$TreeNodeException: After applying rule org.apache.spark.sql.catalyst.optimizer.FoldablePropagation in batch Operator Optimization before Inferring Filters, the structural integrity of the plan is broken., tree:
   [info] GlobalLimit 1
   [info] +- LocalLimit 1
   [info]    +- Sort [0 ASC NULLS FIRST], true
   [info]       +- Project [0 AS key#495, cnt_val#500L]               --- (2)
   [info]          +- Join Cross, (0 = 0)
   [info]             :- Project [0 AS key#495]                       --- (1)
   [info]             :  +- Window [0]
   [info]             :     +- Project [0 AS key#495, 1 AS value#496]
   [info]             :        +- OneRowRelation
   [info]             +- Project [0 AS key#501, cnt_val#500L]
   [info]                +- Window [count(1) windowspecdefinition(0, specifiedwindowframe(RowFrame, unboundedpreceding$(), unboundedfollowing$())) AS cnt_val#500L], [0]
   [info]                   +- Project [0 AS key#501, 1 AS value#502]
   [info]                      +- OneRowRelation
   ```
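    The intent of the filter can be sketched in isolation. Below, plain `Int`s stand in for `ExprId`s and a tiny `SimpleAlias` case class stands in for Catalyst's `Alias` (both hypothetical, not the actual API): a newly assigned id must not collide with an input attribute's id, except when the alias merely renames a foldable expression, which is exactly the pattern `FoldablePropagation` produces (`0 AS key#501`).

   ```scala
   object ReuseCheckSketch {
     // Hypothetical stand-in for Catalyst's Alias: the ExprId it assigns
     // and whether its child expression is foldable.
     case class SimpleAlias(exprId: Int, childFoldable: Boolean)

     // Mirrors the intent of `checkIfSameExprIdNotReused`: ignore aliases
     // over foldable children (FoldablePropagation's renaming pattern),
     // then reject any remaining alias that reuses an input attribute's id.
     def noIllegalReuse(inputIds: Set[Int], aliases: Seq[SimpleAlias]): Boolean =
       aliases.filterNot(_.childFoldable).forall(a => !inputIds.contains(a.exprId))

     def main(args: Array[String]): Unit = {
       // `0 AS key#501` reusing input id 501: foldable child, so accepted.
       println(noIllegalReuse(Set(501), Seq(SimpleAlias(501, childFoldable = true))))
       // `a#1 + 1 AS a#1`: non-foldable child reusing an input id, rejected.
       println(noIllegalReuse(Set(1), Seq(SimpleAlias(1, childFoldable = false))))
     }
   }
   ```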
    I cannot find a way to fix this issue in `FoldablePropagation` because I think we cannot simply create plan remapping rules for `Analyzer.rewritePlan`.
   
    Let's say we create the two simple rules below:
   ```
   (1) Project [0 AS key#495]                 ==> Project [0 AS key#501]
   (2) Project [0 AS key#495, cnt_val#500L]   ==> Project [0 AS key#502, cnt_val#500L]
   ```
    In this case, `attrMapping` in `Analyzer.rewritePlan` has duplicate entries (key#495 -> {#501, #502}) in the `Sort` node, and the assertion below fails:
   ```
   [info]   java.lang.AssertionError: assertion failed: Found duplicate rewrite attributes
   ```
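    The duplicate-entry situation can be reproduced with a minimal standalone sketch, with plain `Int`s standing in for `ExprId`s and the `attrMapping` pairs taken from the two rules above (a simplified illustration, not the actual `Analyzer` code):

   ```scala
   object DuplicateRewriteSketch {
     // Hypothetical simplified attrMapping: (old ExprId, new ExprId) pairs
     // that the two remapping rules above would produce.
     val attrMapping: Seq[(Int, Int)] = Seq(
       495 -> 501, // rule (1): key#495 ==> key#501
       495 -> 502  // rule (2): key#495 ==> key#502
     )

     // True when some old ExprId maps to more than one distinct new id,
     // which is the condition the Analyzer assertion rejects.
     def hasDuplicateRewrites: Boolean =
       attrMapping.groupBy(_._1).values.exists(_.map(_._2).distinct.size > 1)

     def main(args: Array[String]): Unit =
       println(s"duplicate rewrite attributes: $hasDuplicateRewrites")
   }
   ```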
   
https://github.com/apache/spark/blob/04f7f6dac0b9177e11482cca4e7ebf7b7564e45f/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala#L162-L164
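    For context, the uniqueness condition that `hasUniqueExprIdsForOutput` enforces boils down to the grouping logic below, with plain `(Int, String)` tuples standing in for canonicalized attributes (hypothetical values, not the Catalyst API):

   ```scala
   object ExprIdUniqueness {
     // (exprId, canonicalized attribute representation) pairs gathered
     // from every node's output; values here are made up for illustration.
     type Attr = (Int, String)

     // Every ExprId must map to exactly one distinct canonical attribute.
     def hasUniqueExprIds(attrs: Seq[Attr]): Boolean =
       attrs.groupBy(_._1).values.forall(_.distinct.size == 1)

     def main(args: Array[String]): Unit = {
       // exprId 1 appears twice but always as the same attribute: OK.
       println(hasUniqueExprIds(Seq((1, "key:int"), (1, "key:int"), (2, "value:int"))))
       // exprId 1 refers to two different attributes: integrity is broken.
       println(hasUniqueExprIds(Seq((1, "key:int"), (1, "value:string"))))
     }
   }
   ```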
   



