[ https://issues.apache.org/jira/browse/SPARK-22961?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16311771#comment-16311771 ]
Apache Spark commented on SPARK-22961:
--------------------------------------

User 'adrian-ionescu' has created a pull request for this issue:
https://github.com/apache/spark/pull/20155

> Constant columns no longer picked as constraints in 2.3
> -------------------------------------------------------
>
>                 Key: SPARK-22961
>                 URL: https://issues.apache.org/jira/browse/SPARK-22961
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.3.0, 3.0.0
>            Reporter: Adrian Ionescu
>              Labels: constraints, optimizer, regression
>
> We're no longer picking up {{x = 2}} as a constraint from something like
> {{df.withColumn("x", lit(2))}}
> The unit test below succeeds in {{branch-2.2}}:
> {code}
> test("constraints should be inferred from aliased literals") {
>   val originalLeft = testRelation.subquery('left).as("left")
>   val optimizedLeft = testRelation.subquery('left).where(IsNotNull('a) && 'a <=> 2).as("left")
>   val right = Project(Seq(Literal(2).as("two")), testRelation.subquery('right)).as("right")
>   val condition = Some("left.a".attr === "right.two".attr)
>   val original = originalLeft.join(right, Inner, condition)
>   val correct = optimizedLeft.join(right, Inner, condition)
>   comparePlans(Optimize.execute(original.analyze), correct.analyze)
> }
> {code}
> but fails in {{branch-2.3}} with:
> {code}
> == FAIL: Plans do not match ===
>  'Join Inner, (two#0 = a#0)                     'Join Inner, (two#0 = a#0)
> !:- Filter isnotnull(a#0)                       :- Filter ((2 <=> a#0) && isnotnull(a#0))
>  : +- LocalRelation <empty>, [a#0, b#0, c#0]    : +- LocalRelation <empty>, [a#0, b#0, c#0]
>  +- Project [2 AS two#0]                        +- Project [2 AS two#0]
>     +- LocalRelation <empty>, [a#0, b#0, c#0]      +- LocalRelation <empty>, [a#0, b#0, c#0]
> {code}

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
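The optimization at stake can be sketched outside of Spark. Below is a toy model in Python (Spark's optimizer is Scala, and this is not Spark's actual implementation) of the two steps the test exercises: a projection of a literal such as {{Literal(2).as("two")}} yields the constraint {{two <=> 2}}, and an inner equi-join condition {{left.a = right.two}} lets that constraint propagate to {{a}}, producing the expected {{Filter (2 <=> a#0)}}. All names here ({{Attr}}, {{Lit}}, {{EqualNullSafe}}, {{constraints_from_projection}}, {{infer_join_constraints}}) are illustrative inventions, not Spark APIs.

```python
# Toy sketch of constraint inference from aliased literals.
# NOT Spark code; a hand-rolled model of the idea behind SPARK-22961.
from dataclasses import dataclass


@dataclass(frozen=True)
class Attr:
    """An attribute reference, e.g. `a` or `two`."""
    name: str


@dataclass(frozen=True)
class Lit:
    """A literal value, e.g. 2."""
    value: int


@dataclass(frozen=True)
class EqualNullSafe:
    """Models Spark's null-safe equality `<=>`."""
    left: object
    right: object


def constraints_from_projection(aliases):
    """aliases maps an output attribute name to the expression it aliases.
    A literal alias like {'two': Lit(2)} yields the constraint two <=> 2."""
    out = set()
    for name, expr in aliases.items():
        if isinstance(expr, Lit):
            out.add(EqualNullSafe(Attr(name), expr))
    return out


def infer_join_constraints(left_constraints, right_constraints, condition):
    """Propagate constraints across an inner equi-join condition (a, b),
    i.e. a = b: if b <=> v holds on the right side, then a <=> v holds
    on every joined row as well."""
    a, b = condition
    inferred = set(left_constraints) | set(right_constraints)
    for c in right_constraints:
        if isinstance(c, EqualNullSafe) and c.left == b:
            inferred.add(EqualNullSafe(a, c.right))
    return inferred
```

In this toy model, joining on {{left.a = right.two}} against a right side projecting {{2 AS two}} infers {{a <=> 2}}, which is the constraint the unit test above expects {{branch-2.3}} to push into the {{Filter}}:

```python
right = constraints_from_projection({'two': Lit(2)})
inferred = infer_join_constraints(set(), right, (Attr('a'), Attr('two')))
# inferred now contains EqualNullSafe(Attr('a'), Lit(2))
```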