MaxGekk commented on a change in pull request #23150: [SPARK-26178][SQL] Use java.time API for parsing timestamps and dates from CSV URL: https://github.com/apache/spark/pull/23150#discussion_r242091080
########## File path: docs/sql-migration-guide-upgrade.md ##########
@@ -33,6 +33,8 @@ displayTitle: Spark SQL Upgrading Guide

 - Spark applications which are built with Spark version 2.4 and prior, and call methods of `UserDefinedFunction`, need to be re-compiled with Spark 3.0, as they are not binary compatible with Spark 3.0.

+ - Since Spark 3.0, the CSV datasource uses the java.time API for parsing and generating CSV content. The new formatting implementation supports date/timestamp patterns conforming to ISO 8601. To switch back to the implementation used in Spark 2.4 and earlier, set `spark.sql.legacy.timeParser.enabled` to `true`.

Review comment:

I see the same, but it is interesting that:

```scala
scala> spark.range(1).map(_ => org.apache.spark.sql.internal.SQLConf.get.legacyTimeParserEnabled).show
+-----+
|value|
+-----+
| true|
+-----+
```

It seems that when an instance of `CSVInferSchema` is created, the SQL configs have not been set on the executor side yet.
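The timing issue described above can be sketched without a Spark dependency. The sketch below is a hypothetical illustration of the general pitfall, not Spark's actual config-propagation mechanism: `DriverConf`, `LazyInferSchema`, and `EagerInferSchema` are invented names. It contrasts an object that reads a config value lazily (so the result depends on where and when the read happens) with one that captures the value eagerly at construction time on the driver:

```scala
// Hypothetical sketch of the config-propagation pitfall; DriverConf,
// LazyInferSchema, and EagerInferSchema are illustrative names only.
object ConfigCapture {
  // Stands in for a driver-side config store whose values may not yet be
  // visible where the work actually runs.
  object DriverConf {
    var legacyTimeParserEnabled: Boolean = true
  }

  // Anti-pattern: reads the config on every call, so the observed value
  // depends on when/where the method executes.
  class LazyInferSchema {
    def useLegacyParser: Boolean = DriverConf.legacyTimeParserEnabled
  }

  // Safer pattern: the config value is captured once, at construction time,
  // so the instance carries a fixed value wherever it is used.
  class EagerInferSchema(val useLegacyParser: Boolean)

  def main(args: Array[String]): Unit = {
    DriverConf.legacyTimeParserEnabled = true
    val lazyOne  = new LazyInferSchema
    val eagerOne = new EagerInferSchema(DriverConf.legacyTimeParserEnabled)

    // Simulate the config not being set (or being reset) where the work runs.
    DriverConf.legacyTimeParserEnabled = false

    println(lazyOne.useLegacyParser)  // reads the current (reset) value: false
    println(eagerOne.useLegacyParser) // keeps the captured value: true
  }
}
```

This mirrors the observation in the comment: reading `SQLConf.get` inside a task (as the `spark.range(1).map(...)` snippet does) sees the propagated value, while a value read during earlier object construction may not.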
