Ryan Blue commented on SPARK-17995:

[~cloud_fan] and [~yhuai], I'd like to help fix this, but I'm not sure of the best approach.

I started to write an analyzer rule that uses transformUp on the initial 
logical plan, before unresolved aliases are resolved. The rule finds outer 
joins and builds a map from each attribute in the join's output to a new 
attribute for use above the join, with the schema updated to be nullable 
and with a new exprId.

Where I ran into trouble was replacing those attributes in the logical plan 
above the join. I don't think it is a good idea for the rule to have a case for 
every possible plan node, so I think we need a method, implemented by each node 
in the plan, to substitute attributes. That sounds like a larger patch than I 
originally thought, so I want to make sure I'm going down the right path 
before I put up a PR for it. What do you think?
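To make the idea concrete, here is a minimal, self-contained sketch of the two pieces described above: minting fresh, nullable attributes for an outer join's output, and a substitute method that each node would implement to rewrite references above the join. The classes (Attr, Plan, Relation, OuterJoin, Project) are simplified stand-ins, not Catalyst's actual API:

```scala
object OuterJoinAttrSketch {
  // Stand-in for AttributeReference: identity is the exprId, not the name.
  final case class Attr(name: String, exprId: Long, nullable: Boolean)

  sealed trait Plan
  final case class Relation(out: Seq[Attr]) extends Plan
  final case class OuterJoin(left: Plan, right: Plan, out: Seq[Attr]) extends Plan
  final case class Project(exprs: Seq[Attr], child: Plan) extends Plan

  private var id = 100L
  private def freshId(): Long = { id += 1; id }

  // Substitute attributes by exprId in nodes above the join. In a real patch
  // this would be a method implemented by every plan node over its expressions,
  // rather than a match with a case per node type.
  def substitute(plan: Plan, subst: Map[Long, Attr]): Plan = plan match {
    case Project(exprs, child) =>
      Project(exprs.map(a => subst.getOrElse(a.exprId, a)), substitute(child, subst))
    case other => other // substitution stops at and below the join
  }

  // The rule: the plan below the join is unchanged, but everything above the
  // outer join now references new, nullable attributes with fresh exprIds.
  def rewriteAboveOuterJoin(p: Project): Project = p.child match {
    case j: OuterJoin =>
      val subst = j.out
        .map(a => a.exprId -> a.copy(exprId = freshId(), nullable = true))
        .toMap
      val newJoin = j.copy(out = j.out.map(a => subst(a.exprId)))
      substitute(Project(p.exprs, newJoin), subst).asInstanceOf[Project]
    case _ => p
  }
}
```

For example, projecting a column from the null-producing side of an outer join would yield a reference with a new exprId and nullable = true, so the optimizer can no longer treat it as equivalent to the pre-join attribute:

```scala
import OuterJoinAttrSketch._
val x = Attr("x", 1L, nullable = false)
val y = Attr("y", 2L, nullable = false)
val plan = Project(Seq(x), OuterJoin(Relation(Seq(x)), Relation(Seq(y)), Seq(x, y)))
val rewritten = rewriteAboveOuterJoin(plan)
// rewritten.exprs.head now has a fresh exprId and is nullable
```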

> Use new attributes for columns from outer joins
> -----------------------------------------------
>                 Key: SPARK-17995
>                 URL: https://issues.apache.org/jira/browse/SPARK-17995
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 1.6.2, 2.0.0, 2.1.0
>            Reporter: Ryan Blue
> Plans involving outer joins use the same attribute reference (by exprId) to 
> reference columns above the join and below the join. This is a false 
> equivalence that leads to bugs like SPARK-16181, in which attributes were 
> incorrectly replaced by the optimizer. The column has a different schema 
> above the outer join because its values may be null. The fix for that issue, 
> [PR #13884](https://github.com/apache/spark/pull/13884), has a TODO comment 
> from [~cloud_fan] to fix this by using different attributes instead of 
> special-casing outer joins in rules; this issue tracks that improvement.

This message was sent by Atlassian JIRA

To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org