ueshin commented on a change in pull request #23534: [SPARK-26610][PYTHON] Fix inconsistency between toJSON Method in Python and Scala.
URL: https://github.com/apache/spark/pull/23534#discussion_r251281383
########## File path: docs/sql-migration-guide-upgrade.md ##########
@@ -45,6 +45,8 @@ displayTitle: Spark SQL Upgrading Guide
 - In Spark version 2.4 and earlier, if `org.apache.spark.sql.functions.udf(Any, DataType)` gets a Scala closure with a primitive-type argument, the returned UDF returns null if the input value is null. Since Spark 3.0, the UDF returns the default value of the Java type if the input value is null. For example, with `val f = udf((x: Int) => x, IntegerType)`, `f($"x")` returns null in Spark 2.4 and earlier if column `x` is null, and returns 0 in Spark 3.0. This behavior change is introduced because Spark 3.0 is built with Scala 2.12 by default.
+ - Since Spark 3.0, `DataFrame.toJSON()` in PySpark returns a `DataFrame` of JSON strings instead of an `RDD`. The method in Scala/Java was changed to return a `DataFrame` earlier, but the one in PySpark was not changed at that time. If you still want an `RDD`, you can restore the previous behavior by setting `spark.sql.legacy.pyspark.toJsonShouldReturnDataFrame` to `false`.

Review comment:
Actually, I still feel it's inconsistent, because the abstraction layer differs between RDD and DataFrame/Dataset. I guess users expect the method to return something at the same abstraction level, i.e., an RDD returns an RDD and a DataFrame returns a DataFrame, so I'd rather handle Dataset as DataFrame in Python than as RDD. We might still need to discuss the function `map` or the behavior of `DataFrameReader.csv`/`.json`, as @HyukjinKwon mentioned.
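The migration note above can be illustrated without a Spark cluster. Below is a minimal pure-Python sketch of what `toJSON` does per row; the rows and column names are hypothetical, and `json.dumps` merely stands in for Spark's actual JSON encoder:

```python
import json

# Hypothetical rows, standing in for a two-column DataFrame.
rows = [{"name": "Alice", "age": 1}, {"name": "Bob", "age": 2}]

# toJSON serializes each row to one JSON string. In Spark 3.0, PySpark's
# DataFrame.toJSON() wraps these strings in a single-column DataFrame;
# in 2.4 and earlier it returned them as an RDD. The serialization of
# each row is the same either way -- only the container type differs.
json_strings = [json.dumps(r, separators=(",", ":")) for r in rows]

for s in json_strings:
    print(s)
```

The point of the debate in the review comment is only the container type of `json_strings` (RDD vs. DataFrame), not the per-row JSON encoding itself.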