[jira] [Comment Edited] (SPARK-22951) count() after dropDuplicates() on emptyDataFrame returns incorrect value
[ https://issues.apache.org/jira/browse/SPARK-22951?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16311225#comment-16311225 ] Sandor Murakozi edited comment on SPARK-22951 at 1/4/18 11:36 AM:
--
Adding 2.2.0 to affected versions:
{code}
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_152)
Type in expressions to have them evaluated.
Type :help for more information.

scala> Seq.empty[String].toDF.dropDuplicates.count
res0: Long = 0

scala> spark.emptyDataset[String].dropDuplicates.count
res1: Long = 0

scala> sc.emptyRDD[String].toDF.dropDuplicates.count
res2: Long = 0

scala> spark.emptyDataFrame.dropDuplicates.count
res3: Long = 1
{code}

> count() after dropDuplicates() on emptyDataFrame returns incorrect value
>
> Key: SPARK-22951
> URL: https://issues.apache.org/jira/browse/SPARK-22951
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.6.2, 2.2.0
> Reporter: Michael Dreibelbis
>
> Here is a minimal Spark application to reproduce:
> {code}
> import org.apache.spark.sql.SQLContext
> import org.apache.spark.{SparkConf, SparkContext}
>
> object DropDupesApp extends App {
>
>   override def main(args: Array[String]): Unit = {
>     val conf = new SparkConf()
>       .setAppName("test")
>       .setMaster("local")
>     val sc = new SparkContext(conf)
>     val sql = SQLContext.getOrCreate(sc)
>     assert(sql.emptyDataFrame.count == 0)                // expected
>     assert(sql.emptyDataFrame.dropDuplicates.count == 1) // unexpected
>   }
>
> }
> {code}

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
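A minimal sketch of the suspected mechanism, in plain Scala with no Spark required: {{dropDuplicates}} is planned as an aggregate grouped by all of the DataFrame's columns, and {{emptyDataFrame}} has zero columns, so the grouping list is empty and the plan degenerates into a global aggregate, which always emits exactly one row. The {{dedupRowCount}} function below is a hypothetical model for illustration, not Spark's actual planner code:

```scala
// Hypothetical model of the planning behavior (not Spark's actual code):
// deduplication is modeled as grouping on all columns.
def dedupRowCount(rows: Seq[Seq[Any]], numColumns: Int): Int =
  if (numColumns > 0) rows.groupBy(identity).size // one row per distinct key
  else 1 // no grouping columns => global aggregate => always one output row

// An empty input with at least one column dedups to zero rows:
assert(dedupRowCount(Seq.empty, numColumns = 1) == 0)
// Zero columns reproduce the reported off-by-one:
assert(dedupRowCount(Seq.empty, numColumns = 0) == 1)
```

This mirrors the shell session above: the three empty inputs that carry a schema ({{Seq.empty[String].toDF}}, {{emptyDataset}}, {{emptyRDD.toDF}}) all return 0, while the schema-less {{emptyDataFrame}} returns 1.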