srowen commented on a change in pull request #23495: [SPARK-26503][CORE] Get rid of spark.sql.legacy.timeParser.enabled
URL: https://github.com/apache/spark/pull/23495#discussion_r384916414
 
 

 ##########
 File path: sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/json/JsonSuite.scala
 ##########
 @@ -1451,109 +1451,6 @@ class JsonSuite extends QueryTest with SharedSQLContext with TestJsonData {
     })
   }
 
 -  test("backward compatibility") {
 -    withSQLConf(SQLConf.LEGACY_TIME_PARSER_ENABLED.key -> "true") {
 -      // This test we make sure our JSON support can read JSON data generated by previous version
 -      // of Spark generated through toJSON method and JSON data source.
 -      // The data is generated by the following program.
 -      // Here are a few notes:
 -      //  - Spark 1.5.0 cannot save timestamp data. So, we manually added timestamp field (col13)
 -      //      in the JSON object.
 -      //  - For Spark before 1.5.1, we do not generate UDTs. So, we manually added the UDT value to
 -      //      JSON objects generated by those Spark versions (col17).
 -      //  - If the type is NullType, we do not write data out.
 Review comment:
   The test is gone because the old behavior is gone; that's all that's going on here.
   See the OP for a link to the actual change. The key discussions were:
   
   https://github.com/apache/spark/pull/23391#discussion_r244414750
   https://github.com/apache/spark/pull/23391#discussion_r244627486
   
   I think the TL;DR is that the legacy behavior is error-prone and already susceptible to getting wrong answers for old dates. That seems worth 'fixing' despite the forced behavior change.
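
   For context on why old dates go wrong: the legacy path parses through `java.text.SimpleDateFormat`, which uses the hybrid Julian/Gregorian calendar, while the new path uses `java.time`, which resolves dates on the proleptic Gregorian calendar. This is a minimal illustrative sketch (not Spark code; the object and method names are made up) showing the two parsers agreeing on modern dates but silently mapping the same ancient date string to different instants:

   ```scala
   import java.text.SimpleDateFormat
   import java.time.LocalDate
   import java.util.TimeZone

   object LegacyParserDemo {
     // Epoch millis for a "yyyy-MM-dd" string via the legacy-style parser
     // (SimpleDateFormat, hybrid Julian/Gregorian calendar).
     def legacyMillis(s: String): Long = {
       val fmt = new SimpleDateFormat("yyyy-MM-dd")
       fmt.setTimeZone(TimeZone.getTimeZone("UTC"))
       fmt.parse(s).getTime
     }

     // Epoch millis for the same string via java.time
     // (proleptic Gregorian calendar).
     def modernMillis(s: String): Long =
       LocalDate.parse(s).toEpochDay * 86400000L

     def main(args: Array[String]): Unit = {
       // Modern dates resolve identically under both parsers...
       println(legacyMillis("2019-01-01") == modernMillis("2019-01-01"))
       // ...but for old dates the two calendars diverge by several days,
       // so the same string yields a different instant under each parser.
       println(legacyMillis("1000-01-01") == modernMillis("1000-01-01"))
     }
   }
   ```

   The divergence is silent: neither parser throws, they just disagree, which is why the legacy behavior was considered error-prone rather than merely different.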

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
