[ https://issues.apache.org/jira/browse/SPARK-27134?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Josh Rosen updated SPARK-27134:
-------------------------------
    Labels: correctness  (was: )

> array_distinct function does not work correctly with columns containing arrays 
> of arrays
> --------------------------------------------------------------------------------------
>
>                 Key: SPARK-27134
>                 URL: https://issues.apache.org/jira/browse/SPARK-27134
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.0
>            Environment: Spark 2.4, Scala 2.11.11
>            Reporter: Mike Trenaman
>            Assignee: Dilip Biswal
>            Priority: Major
>              Labels: correctness
>             Fix For: 2.4.1, 3.0.0
>
>
> The array_distinct function introduced in Spark 2.4 produces incorrect 
> results when applied to an array column whose elements are themselves arrays. 
> The resulting output can still contain duplicate values, and, worse, 
> previously distinct values may be removed.
> This is easily reproduced, e.g. with the following code (run in a 
> spark-shell, where spark.implicits._ is already in scope):
> import org.apache.spark.sql.functions.{array_distinct, col}
> // One row whose single column holds an array of arrays of Int.
> val df = Seq(
>   Seq(Seq(1, 2), Seq(1, 2), Seq(1, 2), Seq(3, 4), Seq(4, 5))
> ).toDF("Number_Combinations")
> // Deduplicate the outer array; the inner arrays are its elements.
> val dfWithDistinct = df.withColumn("distinct_combinations",
>   array_distinct(col("Number_Combinations")))
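>  
> To inspect both columns side by side (a minimal sketch; truncate = false 
> just keeps the nested arrays readable):
> dfWithDistinct.show(truncate = false)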
>  
> The initial 'df' DataFrame contains one row, where column 
> 'Number_Combinations' contains the following values:
> [[1, 2], [1, 2], [1, 2], [3, 4], [4, 5]]
>  
> Running array_distinct on this column produces a new column containing the 
> following values:
> [[1, 2], [1, 2], [1, 2]]
>  
> As you can see, this output contains three occurrences of the same value 
> [1, 2], and furthermore, the distinct values [3, 4] and [4, 5] have been 
> removed entirely.
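>  
> Until a fixed version (2.4.1 / 3.0.0) is available, one possible workaround 
> (a sketch, not part of this ticket; the UDF name distinctNested is 
> illustrative) is to deduplicate via a UDF with Scala's own Seq.distinct, 
> which compares nested sequences by value:
> import org.apache.spark.sql.functions.udf
> // Hypothetical helper: by-value deduplication of an array-of-arrays column.
> val distinctNested = udf((xs: Seq[Seq[Int]]) => xs.distinct)
> val dfWorkaround = df.withColumn("distinct_combinations",
>   distinctNested(col("Number_Combinations")))
> // For the row above, this should yield [[1, 2], [3, 4], [4, 5]].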


