[ https://issues.apache.org/jira/browse/SPARK-14040?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15207294#comment-15207294 ]
Sunitha Kambhampati commented on SPARK-14040:
---------------------------------------------

I can reproduce this on master (a 2.0 snapshot, synced today). I tried the first scenario from the description.

> Null-safe and equality join produces incorrect result with filtered dataframe
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-14040
>                 URL: https://issues.apache.org/jira/browse/SPARK-14040
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.0
>        Environment: Ubuntu Linux 15.10
>            Reporter: Denton Cockburn
>
> Initial issue reported here:
> http://stackoverflow.com/questions/36131942/spark-join-produces-wrong-results
>
> val b = Seq(("a", "b", 1), ("a", "b", 2)).toDF("a", "b", "c")
> val a = b.where("c = 1").withColumnRenamed("a", "filta").withColumnRenamed("b", "filtb")
>
> a.join(b, $"filta" <=> $"a" and $"filtb" <=> $"b" and a("c") <=> b("c"), "left_outer").show
>
> Produces 2 rows instead of the expected 1.
>
> a.withColumn("newc", $"c").join(b, $"filta" === $"a" and $"filtb" === $"b" and $"newc" === b("c"), "left_outer").show
>
> Also produces 2 rows instead of the expected 1.
>
> The only variant that seemed to work correctly was:
>
> a.join(b, $"filta" === $"a" and $"filtb" === $"b" and a("c") === b("c"), "left_outer").show
>
> But that produced a warning:
>
> WARN Column: Constructing trivially true equals predicate, 'c#18232 = c#18232'
>
> As pointed out by commenter zero323:
> "The second behavior looks indeed like a bug related to the fact that you still have a.c in your data. It looks like it is picked downstream before b.c and the evaluated condition is actually a.newc = a.c"
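For anyone else trying to reproduce this, here is a minimal self-contained sketch of the first scenario. An assumption on my part: it is written as a standalone app against the 1.6-era SQLContext API; in spark-shell the context setup and the implicits import are already provided, so only the last few lines are needed there. The object name Spark14040Repro is just an illustrative label.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object Spark14040Repro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local[*]").setAppName("SPARK-14040"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    val b = Seq(("a", "b", 1), ("a", "b", 2)).toDF("a", "b", "c")

    // Filter to c = 1 and rename the key columns so they do not
    // collide with b's columns of the same name.
    val a = b.where("c = 1")
      .withColumnRenamed("a", "filta")
      .withColumnRenamed("b", "filtb")

    // Null-safe join on all three columns. Expected: 1 row.
    // Observed (per this ticket): 2 rows, apparently because a is derived
    // from b, so a("c") and b("c") resolve to the same attribute and the
    // third predicate becomes trivially true.
    a.join(b,
      $"filta" <=> $"a" and $"filtb" <=> $"b" and a("c") <=> b("c"),
      "left_outer").show()
  }
}

A commonly suggested workaround for this kind of self-join ambiguity (not verified here against this ticket) is to alias both sides with .as("l") / .as("r") and join on qualified columns such as $"l.c" <=> $"r.c", which forces the two sides to resolve to distinct attributes.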