[ https://issues.apache.org/jira/browse/SPARK-26677?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16752606#comment-16752606 ]

Ryan Blue commented on SPARK-26677:
-----------------------------------

To clarify [~dongjoon]'s comment: all recent versions of Parquet are affected 
by this {{not(eqNullSafe(...))}} bug. Only Parquet 1.10.0 is affected by 
PARQUET-1309.

This filter bug has been present since Parquet introduced dictionary filtering.
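
Until a Parquet release with the fix is picked up, one possible workaround on 
the Spark side is to disable Parquet filter pushdown so that Spark evaluates 
the predicate itself instead of handing it to Parquet's dictionary filter. 
This is only a sketch of a mitigation, not a fix: {{spark.sql.parquet.filterPushdown}} 
is an existing Spark SQL conf, but setting it to false disables all Parquet 
predicate pushdown for the session, which can slow down scans.

{code:java}
scala> // Keep the predicate out of Parquet's dictionary-filtering path by
scala> // disabling pushdown; Spark then applies the filter to the rows it reads.
scala> spark.conf.set("spark.sql.parquet.filterPushdown", false)
scala> spark.read.parquet("t").where(not(col("value").eqNullSafe("A"))).show
{code}

With pushdown disabled, the query above should return the single null row, 
matching the Spark 2.2.0/2.3.2 output shown in the description below.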

> Incorrect results of not(eqNullSafe) when data is read from a Parquet file 
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-26677
>                 URL: https://issues.apache.org/jira/browse/SPARK-26677
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.0
>         Environment: Local installation of Spark on Linux (Java 1.8, Ubuntu 
> 18.04).
>            Reporter: Michal Kapalka
>            Priority: Blocker
>              Labels: correctness
>
> Example code (spark-shell from Spark 2.4.0):
> {code:java}
> scala> Seq("A", "A", null).toDS.repartition(1).write.parquet("t")
> scala> spark.read.parquet("t").where(not(col("value").eqNullSafe("A"))).show
> +-----+
> |value|
> +-----+
> +-----+
> {code}
> Running the same with Spark 2.2.0 or 2.3.2 gives the correct result:
> {code:java}
> scala> spark.read.parquet("t").where(not(col("value").eqNullSafe("A"))).show
> +-----+
> |value|
> +-----+
> | null|
> +-----+
> {code}
> Also, with a different input sequence and Spark 2.4.0 we get the correct 
> result:
> {code:java}
> scala> Seq("A", null).toDS.repartition(1).write.parquet("t")
> scala> spark.read.parquet("t").where(not(col("value").eqNullSafe("A"))).show
> +-----+
> |value|
> +-----+
> | null|
> +-----+
> {code}


