[ https://issues.apache.org/jira/browse/SPARK-25739?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Brian Jones updated SPARK-25739:
--------------------------------
    Environment: Databricks - 4.2 (includes Apache Spark 2.3.1, Scala 2.11)  (was: Databricks - 4.2 (includes Apache Spark 2.3.1, Scala 2.11) )

> Double quote coming in as empty value even when emptyValue set as null
> ----------------------------------------------------------------------
>
>                 Key: SPARK-25739
>                 URL: https://issues.apache.org/jira/browse/SPARK-25739
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.2
>         Environment: Databricks - 4.2 (includes Apache Spark 2.3.1, Scala 2.11)
>            Reporter: Brian Jones
>            Priority: Major
>
> Example code -
> {code:java}
> val df = List((1,""),(2,"hello"),(3,"hi"),(4,null)).toDF("key","value")
> df
>   .repartition(1)
>   .write
>   .mode("overwrite")
>   .option("nullValue", null)
>   .option("emptyValue", null)
>   .option("delimiter",",")
>   .option("quoteMode", "NONE")
>   .option("escape","\\")
>   .format("csv")
>   .save("/tmp/nullcsv/")
> var out = dbutils.fs.ls("/tmp/nullcsv/")
> var file = out(out.size - 1)
> val x = dbutils.fs.head("/tmp/nullcsv/" + file.name)
> println(x)
> {code}
> Output -
> {code:java}
> 1,""
> 3,hi
> 2,hello
> 4,
> {code}
> Expected output -
> {code:java}
> 1,
> 3,hi
> 2,hello
> 4,
> {code}
>
> [https://github.com/apache/spark/commit/b7efca7ece484ee85091b1b50bbc84ad779f9bfe]
> This commit is relevant to my issue.
> "Since Spark 2.4, empty strings are saved as quoted empty strings `""`. In
> version 2.3 and earlier, empty strings are equal to `null` values and do not
> reflect to any characters in saved CSV files."
> I am on Spark version 2.3.2, so empty strings should be written as null. Even
> then, I am passing the correct "emptyValue" option. However, my empty values
> are still coming out as `""` in the written file.
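Based on the migration note quoted in the report, the quoted-empty-string output (`""`) is the Spark 2.4+ default for the CSV writer's `emptyValue`. A hedged sketch of a possible workaround on 2.4+ is to set `emptyValue` to the empty string explicitly rather than `null`; this is an illustrative adaptation of the reporter's own snippet, not a confirmed fix, and it assumes an active `SparkSession` bound to `spark`:

```scala
// Sketch only: on Spark 2.4+, emptyValue defaults to a quoted empty
// string ("\"\""). Passing an explicit "" may restore the 2.3-style
// unquoted output for empty cells. `spark` is assumed to be an
// existing SparkSession; the output path is copied from the report.
import spark.implicits._

val df = List((1, ""), (2, "hello"), (3, "hi"), (4, null))
  .toDF("key", "value")

df.repartition(1)
  .write
  .mode("overwrite")
  .option("nullValue", null)  // nulls written as nothing, as in the report
  .option("emptyValue", "")   // illustrative: "" instead of null
  .option("delimiter", ",")
  .option("escape", "\\")
  .format("csv")
  .save("/tmp/nullcsv/")
```

Note that on 2.3.x this option would have no effect, since `emptyValue` for the CSV writer was only introduced by the commit the reporter links (first released in 2.4.0), which is consistent with the behavior observed in the report.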
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org