This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.0 by this push:
     new 4bcba6f  [SPARK-31131][SQL] Remove the unnecessary config spark.sql.legacy.timeParser.enabled
4bcba6f is described below

commit 4bcba6fa61e4edac3a616403af35d7e2b093fed3
Author: Kent Yao <yaooq...@hotmail.com>
AuthorDate: Thu Mar 12 09:24:49 2020 -0700

    [SPARK-31131][SQL] Remove the unnecessary config spark.sql.legacy.timeParser.enabled
    
    spark.sql.legacy.timeParser.enabled should be removed from SQLConf and the migration guide; spark.sql.legacy.timeParserPolicy is the right one.
    
    Fixes the documentation.
    
    No user-facing change.
    
    Passes Jenkins.
    
    Closes #27889 from yaooqinn/SPARK-31131.
    
    Authored-by: Kent Yao <yaooq...@hotmail.com>
    Signed-off-by: Dongjoon Hyun <dongj...@apache.org>
    (cherry picked from commit 7b4b29e8d955b43daa9ad28667e4fadbb9fce49a)
    Signed-off-by: Dongjoon Hyun <dongj...@apache.org>
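    
    A minimal migration sketch, assuming an active SparkSession bound to
    `spark` (hypothetical snippet, not part of this commit): the role of the
    removed flag is now served by spark.sql.legacy.timeParserPolicy.
    
        // Before (removed in Spark 3.0, no longer recognized):
        //   spark.conf.set("spark.sql.legacy.timeParser.enabled", "true")
        // After: pick one of LEGACY, CORRECTED, EXCEPTION.
        spark.conf.set("spark.sql.legacy.timeParserPolicy", "LEGACY")
    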
---
 docs/sql-migration-guide.md                                       | 2 +-
 .../src/main/scala/org/apache/spark/sql/internal/SQLConf.scala    | 8 --------
 2 files changed, 1 insertion(+), 9 deletions(-)

diff --git a/docs/sql-migration-guide.md b/docs/sql-migration-guide.md
index e7ac9f0..1081079 100644
--- a/docs/sql-migration-guide.md
+++ b/docs/sql-migration-guide.md
@@ -67,7 +67,7 @@ license: |
 
  - Since Spark 3.0, Proleptic Gregorian calendar is used in parsing, formatting, and converting dates and timestamps as well as in extracting sub-components like years, days and etc. Spark 3.0 uses Java 8 API classes from the java.time packages that based on ISO chronology (https://docs.oracle.com/javase/8/docs/api/java/time/chrono/IsoChronology.html). In Spark version 2.4 and earlier, those operations are performed by using the hybrid calendar (Julian + Gregorian, see https://docs.orac [...]
 
-    - Parsing/formatting of timestamp/date strings. This effects on CSV/JSON datasources and on the `unix_timestamp`, `date_format`, `to_unix_timestamp`, `from_unixtime`, `to_date`, `to_timestamp` functions when patterns specified by users is used for parsing and formatting. Since Spark 3.0, the conversions are based on `java.time.format.DateTimeFormatter`, see https://docs.oracle.com/javase/8/docs/api/java/time/format/DateTimeFormatter.html. New implementation performs strict checking o [...]
+    - Parsing/formatting of timestamp/date strings. This effects on CSV/JSON datasources and on the `unix_timestamp`, `date_format`, `to_unix_timestamp`, `from_unixtime`, `to_date`, `to_timestamp` functions when patterns specified by users is used for parsing and formatting. Since Spark 3.0, we define our own pattern strings in `sql-ref-datetime-pattern.md`, which is implemented via `java.time.format.DateTimeFormatter` under the hood. New implementation performs strict checking of its in [...]
 
    - The `weekofyear`, `weekday`, `dayofweek`, `date_trunc`, `from_utc_timestamp`, `to_utc_timestamp`, and `unix_timestamp` functions use java.time API for calculation week number of year, day number of week as well for conversion from/to TimestampType values in UTC time zone.
 
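To make the documented change concrete, here is a short Scala sketch (not
part of the patch; it assumes a local SparkSession) showing Spark 3.0's
DateTimeFormatter-based parsing and the opt-out back to the 2.4 behavior:

    import org.apache.spark.sql.SparkSession

    object TimeParserDemo {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .master("local[*]").appName("demo").getOrCreate()
        import spark.implicits._

        // Spark 3.0 semantics: patterns follow sql-ref-datetime-pattern.md,
        // implemented with java.time.format.DateTimeFormatter.
        spark.conf.set("spark.sql.legacy.timeParserPolicy", "CORRECTED")
        Seq("2020-03-12 09:24:49").toDF("s")
          .selectExpr("to_timestamp(s, 'yyyy-MM-dd HH:mm:ss') AS ts")
          .show(false)

        // Spark 2.4 semantics (java.text.SimpleDateFormat) on request.
        spark.conf.set("spark.sql.legacy.timeParserPolicy", "LEGACY")
        spark.stop()
      }
    }
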
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
index 06180f6..ba25a68 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
@@ -2234,14 +2234,6 @@ object SQLConf {
       .checkValue(_ > 0, "The value of spark.sql.addPartitionInBatch.size must be positive")
       .createWithDefault(100)
 
-  val LEGACY_TIME_PARSER_ENABLED = buildConf("spark.sql.legacy.timeParser.enabled")
-    .internal()
-    .doc("When set to true, java.text.SimpleDateFormat is used for formatting and parsing " +
-      "dates/timestamps in a locale-sensitive manner. When set to false, classes from " +
-      "java.time.* packages are used for the same purpose.")
-    .booleanConf
-    .createWithDefault(false)
-
   val LEGACY_ALLOW_HASH_ON_MAPTYPE = buildConf("spark.sql.legacy.allowHashOnMapType")
     .doc("When set to true, hash expressions can be applied on elements of MapType. Otherwise, " +
       "an analysis exception will be thrown.")

