Github user peter-toth commented on the issue:

    https://github.com/apache/spark/pull/22817
  
    So based on the unit test results it seems that simply changing the resolution order to bottom-up causes issues with `LambdaFunction`s in the current version of Spark.
    
    The issue seems to be a regression caused by this commit: https://github.com/apache/spark/commit/36b826f5d17ae7be89135cb2c43ff797f9e7fe48
    
    But simply reverting `ResolveReferences` to bottom-up resolution hasn't been viable since `LambdaFunction`s were introduced. `ResolveReferences` has to work top-down and stop traversing the expression tree at an unbound `LambdaFunction`, because a parameter attribute of a lambda function can collide with an attribute from a DataFrame. Such a colliding parameter attribute should not be resolved in `ResolveReferences` but in `ResolveLambdaVariables`, as the sketch below illustrates.
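    Here is a minimal sketch of the collision (assuming a Spark 2.4+ session where the `transform` higher-order function is available; the column and object names are made up for illustration):

```scala
import org.apache.spark.sql.SparkSession

object LambdaCollisionSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").getOrCreate()
    import spark.implicits._

    // The top-level column `x` shares its name with the lambda parameter `x`.
    val df = Seq((10, Seq(1, 2, 3))).toDF("x", "arr")

    // ResolveReferences must stop at the unbound LambdaFunction so that the
    // inner `x` is later bound to the lambda parameter by
    // ResolveLambdaVariables rather than to the DataFrame column `x` above.
    df.selectExpr("transform(arr, x -> x + 1)").show(false)
    // Expected result: [2, 3, 4], not [11, 12, 13]

    spark.stop()
  }
}
```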

