cloud-fan commented on a change in pull request #23150: [SPARK-26178][SQL] Use 
java.time API for parsing timestamps and dates from CSV
URL: https://github.com/apache/spark/pull/23150#discussion_r242044996
 
 

 ##########
 File path: docs/sql-migration-guide-upgrade.md
 ##########
 @@ -33,6 +33,8 @@ displayTitle: Spark SQL Upgrading Guide
 
   - Spark applications which are built with Spark version 2.4 and prior, and 
call methods of `UserDefinedFunction`, need to be re-compiled with Spark 3.0, 
as they are not binary compatible with Spark 3.0.
 
+  - Since Spark 3.0, the CSV datasource uses the java.time API for parsing and 
generating CSV content. The new formatting implementation supports date/timestamp 
patterns conforming to ISO 8601. To switch back to the implementation used in 
Spark 2.4 and earlier, set `spark.sql.legacy.timeParser.enabled` to `true`.
 
 Review comment:
   I'm surprised it doesn't work, as this pattern of using SQLConf appears in 
many places.
   
   Can you create a ticket for it? Is this a problem only when setting the conf 
via spark-shell?
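
For context on the diff above: the java.time API parses ISO 8601 timestamps out of the box, and custom patterns can be supplied via `DateTimeFormatter`. This is a minimal standalone JDK sketch, not Spark code; the class name `IsoParseDemo` is made up for illustration.

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class IsoParseDemo {
    public static void main(String[] args) {
        // ISO 8601 local date-time is parsed without an explicit pattern
        LocalDateTime ts = LocalDateTime.parse("2018-12-14T10:39:40");
        System.out.println(ts.getYear());   // 2018

        // A custom pattern, similar to what a CSV timestampFormat option might specify
        DateTimeFormatter fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
        LocalDateTime ts2 = LocalDateTime.parse("2018-12-14 10:39:40", fmt);
        System.out.println(ts2.getHour());  // 10
    }
}
```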

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
