srowen commented on a change in pull request #23495: [SPARK-26503][CORE] Get rid of spark.sql.legacy.timeParser.enabled
URL: https://github.com/apache/spark/pull/23495#discussion_r385236409
##########
File path: sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/json/JsonSuite.scala
##########
@@ -1451,109 +1451,6 @@ class JsonSuite extends QueryTest with SharedSQLContext with TestJsonData {
     })
   }
-  test("backward compatibility") {
-    withSQLConf(SQLConf.LEGACY_TIME_PARSER_ENABLED.key -> "true") {
-      // This test we make sure our JSON support can read JSON data generated by previous version
-      // of Spark generated through toJSON method and JSON data source.
-      // The data is generated by the following program.
-      // Here are a few notes:
-      //   - Spark 1.5.0 cannot save timestamp data. So, we manually added timestamp field (col13)
-      //     in the JSON object.
-      //   - For Spark before 1.5.1, we do not generate UDTs. So, we manually added the UDT value to
-      //     JSON objects generated by those Spark versions (col17).
-      //   - If the type is NullType, we do not write data out.
Review comment:
I defer to @MaxGekk and @cloud-fan on that thread. It's trading one set of problems for another, but it could be the right thing. We will never get rid of the legacy behavior now, I'm pretty sure :)
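For context, here is a minimal sketch of the pattern the removed test relied on: toggling the legacy time parser via `SQLConf.LEGACY_TIME_PARSER_ENABLED` inside `withSQLConf` (that config is exactly what this PR removes, so this reflects the pre-removal state). The JSON payload, schema, and expected value are illustrative assumptions, not taken from the deleted test; it also assumes the usual `QueryTest`/`SharedSQLContext` helpers (`testImplicits`, `checkAnswer`) available in `JsonSuite`.

```scala
// Hypothetical sketch: force the legacy time parser on for one test only.
// The config key and the withSQLConf pattern come from the removed test above;
// the JSON data and expected timestamp are made up for illustration.
test("timestamp parsing with the legacy parser enabled (sketch)") {
  import testImplicits._
  withSQLConf(SQLConf.LEGACY_TIME_PARSER_ENABLED.key -> "true") {
    val df = spark.read
      .schema("ts TIMESTAMP")
      .option("timestampFormat", "yyyy-MM-dd HH:mm:ss")
      .json(Seq("""{"ts": "2015-08-26 18:00:00"}""").toDS())
    // The legacy (SimpleDateFormat-based) parser should accept this value.
    checkAnswer(df, Row(java.sql.Timestamp.valueOf("2015-08-26 18:00:00")))
  }
}
```

Scoping the override with `withSQLConf` keeps the legacy setting from leaking into other tests, which is why the deleted test wrapped its whole body in it.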