anchovYu commented on code in PR #38776:
URL: https://github.com/apache/spark/pull/38776#discussion_r1044807443


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/namedExpressions.scala:
##########
@@ -424,8 +424,51 @@ case class OuterReference(e: NamedExpression)
   override def qualifier: Seq[String] = e.qualifier
   override def exprId: ExprId = e.exprId
   override def toAttribute: Attribute = e.toAttribute
-  override def newInstance(): NamedExpression = OuterReference(e.newInstance())
+  override def newInstance(): NamedExpression =
+    OuterReference(e.newInstance()).setNameParts(nameParts)
   final override val nodePatterns: Seq[TreePattern] = Seq(OUTER_REFERENCE)
+
+  // optional field, the original name parts of UnresolvedAttribute before it is resolved to
+  // OuterReference. Used in rule ResolveLateralColumnAlias to convert OuterReference back to
+  // LateralColumnAliasReference.
+  var nameParts: Option[Seq[String]] = None
+  def setNameParts(newNameParts: Option[Seq[String]]): OuterReference = {

Review Comment:
   As we discussed before, I feel it is not safe to do so, given the current solution in ResolveOuterReference where each rule is applied only once. I made up a query (it can't run, it's just for demonstration):
   ```
   SELECT *
   FROM range(1, 7)
   WHERE (
     SELECT id2
     FROM (SELECT dept * 2.0 AS id, id + 1 AS id2 FROM $testTable)) > 5
   ORDER BY id
   ```
   It is possible that `dept * 2.0` is not resolved because it needs type conversion, so the LCA rule doesn't apply. ResolveOuterReference then just wraps the `id` in `id + 1 AS id2` as an OuterReference.
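
   For context on the diff above, here is a minimal, self-contained Scala sketch (not Spark code; `Ref` and `Demo` are made-up stand-ins for `OuterReference` and a driver) of why `newInstance()` has to forward `nameParts` explicitly: a `var` declared in a case class body is not carried over by `copy()` or by constructing a fresh instance.
   ```
   // Hypothetical stand-in for OuterReference, to show the copy semantics only.
   case class Ref(name: String) {
     // analogous to the optional nameParts metadata added in this PR
     var nameParts: Option[Seq[String]] = None
     def setNameParts(newNameParts: Option[Seq[String]]): Ref = {
       nameParts = newNameParts
       this
     }
     // analogous to newInstance(): the metadata must be re-set on the new copy
     def newInstance(): Ref = Ref(name).setNameParts(nameParts)
   }

   object Demo extends App {
     val r = Ref("id").setNameParts(Some(Seq("t", "id")))
     println(r.copy().nameParts)        // None -- body vars are not copied
     println(r.newInstance().nameParts) // Some(List(t, id)) -- explicitly forwarded
   }
   ```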



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

