axel dahl created SPARK-8685:
--------------------------------

             Summary: DataFrame left joins are not working as expected in PySpark
                 Key: SPARK-8685
                 URL: https://issues.apache.org/jira/browse/SPARK-8685
             Project: Spark
          Issue Type: Bug
          Components: PySpark, SQL
    Affects Versions: 1.4.0
         Environment: ubuntu 14.04
            Reporter: axel dahl


I have the following code (run in the PySpark shell, so `sc` is already defined):

from pyspark.sql import SQLContext

d1 = [{'name': 'bob',   'country': 'usa', 'age': 1},
      {'name': 'alice', 'country': 'jpn', 'age': 2},
      {'name': 'carol', 'country': 'ire', 'age': 3}]

d2 = [{'name': 'bob',   'country': 'usa', 'colour': 'red'},
      {'name': 'carol', 'country': 'ire', 'colour': 'green'}]

sqlContext = SQLContext(sc)
df1 = sqlContext.createDataFrame(d1)
df2 = sqlContext.createDataFrame(d2)
df1.join(df2, df1.name == df2.name and df1.country == df2.country,
         'left_outer').collect()
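
Note that the join condition above uses Python's plain `and`. `and` cannot be overloaded, so Python evaluates it before Spark ever sees the condition; a minimal sketch of the consequence (assuming this version lets a Column be used in a boolean context, as newer releases raise a ValueError there instead):

cond1 = df1.name == df2.name
cond2 = df1.country == df2.country
cond = cond1 and cond2  # plain Python `and`: returns cond2 when bool(cond1) is truthy
print(cond is cond2)    # True -- the name comparison never reaches the join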


When I run it I get the following (notice that in the first row all the join keys are taken from the right side and so come back as None):

[Row(age=2, country=None, name=None, colour=None, country=None, name=None),
 Row(age=1, country=u'usa', name=u'bob', colour=u'red', country=u'usa', name=u'bob'),
 Row(age=3, country=u'ire', name=u'carol', colour=u'green', country=u'ire', name=u'alice')]

I would expect to get (though ideally without duplicate columns):
[Row(age=2, country=u'jpn', name=u'alice', colour=None, country=None, name=None),
 Row(age=1, country=u'usa', name=u'bob', colour=u'red', country=u'usa', name=u'bob'),
 Row(age=3, country=u'ire', name=u'carol', colour=u'green', country=u'ire', name=u'carol')]
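
On the duplicate columns: if a newer Spark release is an option, join() there also accepts a list of column names and keeps a single copy of each join key. A sketch, assuming a release after 1.4.0 (the list form of `on` is not available in 1.4.0, as far as I can tell):

df1.join(df2, ['name', 'country'], 'left_outer').collect()
# result carries a single `name` and a single `country` column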

The workaround for now is this rather clunky piece of code:
df2 = (sqlContext.createDataFrame(d2)
       .withColumnRenamed('name', 'name2')
       .withColumnRenamed('country', 'country2'))
df1.join(df2, df1.name == df2.name2 and df1.country == df2.country2,
         'left_outer').collect()
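
If the root cause is the `and` behaviour sketched above, the supported way to combine two Column conditions is the bitwise `&` operator, which Spark overloads for boolean AND (the parentheses matter, since `&` binds tighter than `==`):

df2 = sqlContext.createDataFrame(d2)  # original df2 again, without the renames
df1.join(df2,
         (df1.name == df2.name) & (df1.country == df2.country),
         'left_outer').collect()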
