This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.0 by this push:
     new 4ee13b1  [SPARK-31879][SQL][TEST-JAVA11] Make week-based pattern invalid for formatting too
4ee13b1 is described below

commit 4ee13b1214d007538aa0a1e3382c51b917594f43
Author: Kent Yao <yaooq...@hotmail.com>
AuthorDate: Fri Jun 5 08:14:01 2020 +0000

    [SPARK-31879][SQL][TEST-JAVA11] Make week-based pattern invalid for formatting too
    
    After all these attempts (https://github.com/apache/spark/pull/28692, https://github.com/apache/spark/pull/28719 and https://github.com/apache/spark/pull/28727), they all have limitations as mentioned in their discussions.

    Maybe the only way is to forbid them all.
    
    These week-based fields need a Locale to express their semantics, because the first day of the week varies from country to country.
    
    From the Java doc of WeekFields:
    ```java
        /**
         * Gets the first day-of-week.
         * <p>
         * The first day-of-week varies by culture.
         * For example, the US uses Sunday, while France and the ISO-8601 standard use Monday.
         * This method returns the first day using the standard {@code DayOfWeek} enum.
         *
         * @return the first day-of-week, not null
         */
        public DayOfWeek getFirstDayOfWeek() {
            return firstDayOfWeek;
        }
    ```
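
    This locale dependence is easy to reproduce with plain java.time (a minimal sketch, independent of Spark):

    ```scala
    import java.time.DayOfWeek
    import java.time.temporal.WeekFields
    import java.util.Locale

    // The first day-of-week comes from the locale backing WeekFields.
    assert(WeekFields.of(Locale.US).getFirstDayOfWeek == DayOfWeek.SUNDAY)
    assert(WeekFields.of(Locale.UK).getFirstDayOfWeek == DayOfWeek.MONDAY)
    ```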
    
    But in SimpleDateFormat, the day-of-week ('u') is not localized:
    
    ```
    u   Day number of week (1 = Monday, ..., 7 = Sunday)        Number  1
    ```
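
    For instance (a small sketch of the legacy behavior; the `dayNumber` helper is just for illustration):

    ```scala
    import java.text.SimpleDateFormat
    import java.util.{Locale, TimeZone}

    // In SimpleDateFormat, 'u' is a fixed day number (1 = Monday ... 7 = Sunday),
    // regardless of the locale. 2019-12-29 was a Sunday.
    def dayNumber(locale: Locale): String = {
      val parser = new SimpleDateFormat("yyyy-MM-dd", locale)
      parser.setTimeZone(TimeZone.getTimeZone("UTC"))
      val u = new SimpleDateFormat("u", locale)
      u.setTimeZone(TimeZone.getTimeZone("UTC"))
      u.format(parser.parse("2019-12-29"))
    }
    assert(dayNumber(Locale.US) == "7") // Sunday is 7 ...
    assert(dayNumber(Locale.UK) == "7") // ... in every locale
    ```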
    
    Currently, the default locale we use is the US one, so a result can shift by a day, a week, or even a year.
    
    e.g.
    
    For the date `2019-12-29` (a Sunday): in a Sunday-start system (e.g. en-US) it belongs to week-based-year 2020, while in a Monday-start system (e.g. en-GB) it stays in 2019. The week-of-week-based-year (`w`) is affected too:
    
    ```sql
    spark-sql> SELECT to_csv(named_struct('time', to_timestamp('2019-12-29', 'yyyy-MM-dd')), map('timestampFormat', 'YYYY', 'locale', 'en-US'));
    2020
    spark-sql> SELECT to_csv(named_struct('time', to_timestamp('2019-12-29', 'yyyy-MM-dd')), map('timestampFormat', 'YYYY', 'locale', 'en-GB'));
    2019

    spark-sql> SELECT to_csv(named_struct('time', to_timestamp('2019-12-29', 'yyyy-MM-dd')), map('timestampFormat', 'YYYY-ww-uu', 'locale', 'en-US'));
    2020-01-01
    spark-sql> SELECT to_csv(named_struct('time', to_timestamp('2019-12-29', 'yyyy-MM-dd')), map('timestampFormat', 'YYYY-ww-uu', 'locale', 'en-GB'));
    2019-52-07

    spark-sql> SELECT to_csv(named_struct('time', to_timestamp('2020-01-05', 'yyyy-MM-dd')), map('timestampFormat', 'YYYY-ww-uu', 'locale', 'en-US'));
    2020-02-01
    spark-sql> SELECT to_csv(named_struct('time', to_timestamp('2020-01-05', 'yyyy-MM-dd')), map('timestampFormat', 'YYYY-ww-uu', 'locale', 'en-GB'));
    2020-01-07
    ```
    
    For other countries, please refer to [First Day of the Week in Different Countries](http://chartsbin.com/view/41671).
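
    The same shift is reproducible with java.time alone, because the week-based-year 'Y' is localized in DateTimeFormatter (a minimal sketch):

    ```scala
    import java.time.LocalDate
    import java.time.format.DateTimeFormatter
    import java.util.Locale

    val sunday = LocalDate.of(2019, 12, 29)
    // The week-based year flips with the locale's first day-of-week rules.
    assert(DateTimeFormatter.ofPattern("YYYY", Locale.US).format(sunday) == "2020")
    assert(DateTimeFormatter.ofPattern("YYYY", Locale.UK).format(sunday) == "2019")
    ```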
    
    With this change, users can no longer use the week-based pattern letters ('Y', 'w', 'W', 'u', 'e', 'c') at all. This at least turns what used to be a silent data change into an explicit error.
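
    As the new error message suggests, the week-based fields stay reachable through the EXTRACT SQL function. A hedged sketch (it assumes a SparkSession named `spark` is in scope, and that WEEK and DAYOFWEEK are among Spark 3.0's extract fields):

    ```scala
    // ISO week-of-year of 2019-12-29 (a Sunday) is 52.
    spark.sql("SELECT extract(WEEK FROM date'2019-12-29')").show()
    // dayofweek counts from Sunday = 1, so a Sunday yields 1.
    spark.sql("SELECT extract(DAYOFWEEK FROM date'2019-12-29')").show()
    ```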
    
    Add unit tests.
    
    Closes #28728 from yaooqinn/SPARK-31879-NEW2.
    
    Authored-by: Kent Yao <yaooq...@hotmail.com>
    Signed-off-by: Wenchen Fan <wenc...@databricks.com>
    (cherry picked from commit 9d5b5d0a5849ac329bbae26d9884d8843d8a8571)
    Signed-off-by: Wenchen Fan <wenc...@databricks.com>
---
 docs/sql-ref-datetime-pattern.md                   |  14 +-
 .../spark/sql/catalyst/util/DateFormatter.scala    |  10 +-
 .../catalyst/util/DateTimeFormatterHelper.scala    |  34 +-
 .../sql/catalyst/util/TimestampFormatter.scala     |  12 +-
 .../expressions/DateExpressionsSuite.scala         |   2 +-
 .../{ => catalyst}/util/DateFormatterSuite.scala   |  13 +-
 .../util/DateTimeFormatterHelperSuite.scala        |  14 +-
 .../sql/catalyst/util/DatetimeFormatterSuite.scala |  54 +++
 .../util/TimestampFormatterSuite.scala             |  61 ++-
 .../inputs/datetime-formatting-invalid.sql         |  53 +++
 .../inputs/datetime-formatting-legacy.sql          |   2 +
 .../sql-tests/inputs/datetime-formatting.sql       |  68 ++++
 .../test/resources/sql-tests/inputs/datetime.sql   |   4 -
 .../sql-tests/results/ansi/datetime.sql.out        |  28 +-
 .../results/datetime-formatting-invalid.sql.out    | 335 ++++++++++++++++
 .../results/datetime-formatting-legacy.sql.out     | 401 +++++++++++++++++++
 .../sql-tests/results/datetime-formatting.sql.out  | 431 +++++++++++++++++++++
 .../sql-tests/results/datetime-legacy.sql.out      |  28 +-
 .../resources/sql-tests/results/datetime.sql.out   |  28 +-
 19 files changed, 1421 insertions(+), 171 deletions(-)

diff --git a/docs/sql-ref-datetime-pattern.md b/docs/sql-ref-datetime-pattern.md
index 5859ad8..3c0bc75 100644
--- a/docs/sql-ref-datetime-pattern.md
+++ b/docs/sql-ref-datetime-pattern.md
@@ -36,11 +36,7 @@ Spark uses pattern letters in the following table for date and timestamp parsing
 |**M/L**|month-of-year|month|7; 07; Jul; July|
 |**d**|day-of-month|number(3)|28|
 |**Q/q**|quarter-of-year|number/text|3; 03; Q3; 3rd quarter|
-|**Y**|week-based-year|year|1996; 96|
-|**w**|week-of-week-based-year|number(2)|27|
-|**W**|week-of-month|number(1)|4|
 |**E**|day-of-week|text|Tue; Tuesday|
-|**u**|localized day-of-week|number/text|2; 02; Tue; Tuesday|
 |**F**|week-of-month|number(1)|3|
 |**a**|am-pm-of-day|am-pm|PM|
 |**h**|clock-hour-of-am-pm (1-12)|number(2)|12|
@@ -63,7 +59,7 @@ Spark uses pattern letters in the following table for date and timestamp parsing
 
 The count of pattern letters determines the format.
 
-- Text: The text style is determined based on the number of pattern letters used. Less than 4 pattern letters will use the short form. Exactly 4 pattern letters will use the full form. Exactly 5 pattern letters will use the narrow form. 5 or more letters will fail.
+- Text: The text style is determined based on the number of pattern letters used. Less than 4 pattern letters will use the short text form, typically an abbreviation, e.g. day-of-week Monday might output "Mon". Exactly 4 pattern letters will use the full text form, typically the full description, e.g, day-of-week Monday might output "Monday". 5 or more letters will fail.
 
 - Number(n): The n here represents the maximum count of letters this type of datetime pattern can be used. If the count of letters is one, then the value is output using the minimum number of digits and without padding. Otherwise, the count of digits is used as the width of the output field, with the value zero-padded as necessary.
 
@@ -137,10 +133,4 @@ The count of pattern letters determines the format.
   During parsing, the whole section may be missing from the parsed string.
   An optional section is started by `[` and ended using `]` (or at the end of the pattern).
   
-- Symbols of 'Y', 'W', 'w', 'E', 'u', 'F', 'q' and 'Q' can only be used for datetime formatting, e.g. `date_format`. They are not allowed used for datetime parsing, e.g. `to_timestamp`.
-
-More details for the text style:
-
-- Short Form: Short text, typically an abbreviation. For example, day-of-week 
Monday might output "Mon".
-
-- Full Form: Full text, typically the full description. For example, 
day-of-week Monday might output "Monday".
+- Symbols of 'E', 'F', 'q' and 'Q' can only be used for datetime formatting, e.g. `date_format`. They are not allowed used for datetime parsing, e.g. `to_timestamp`.
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateFormatter.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateFormatter.scala
index b3347eb..fbc9f56 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateFormatter.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateFormatter.scala
@@ -46,7 +46,7 @@ class Iso8601DateFormatter(
   extends DateFormatter with DateTimeFormatterHelper {
 
   @transient
-  private lazy val formatter = getOrCreateFormatter(pattern, locale)
+  private lazy val formatter = getOrCreateFormatter(pattern, locale, isParsing)
 
   @transient
   private lazy val legacyFormatter = DateFormatter.getLegacyFormatter(
@@ -127,7 +127,7 @@ object DateFormatter {
       zoneId: ZoneId,
       locale: Locale = defaultLocale,
       legacyFormat: LegacyDateFormat = LENIENT_SIMPLE_DATE_FORMAT,
-      isParsing: Boolean = true): DateFormatter = {
+      isParsing: Boolean): DateFormatter = {
     val pattern = format.getOrElse(defaultPattern)
     if (SQLConf.get.legacyTimeParserPolicy == LEGACY) {
       getLegacyFormatter(pattern, zoneId, locale, legacyFormat)
@@ -160,11 +160,11 @@ object DateFormatter {
     getFormatter(Some(format), zoneId, locale, legacyFormat, isParsing)
   }
 
-  def apply(format: String, zoneId: ZoneId): DateFormatter = {
-    getFormatter(Some(format), zoneId)
+  def apply(format: String, zoneId: ZoneId, isParsing: Boolean = false): DateFormatter = {
+    getFormatter(Some(format), zoneId, isParsing = isParsing)
   }
 
   def apply(zoneId: ZoneId): DateFormatter = {
-    getFormatter(None, zoneId)
+    getFormatter(None, zoneId, isParsing = false)
   }
 }
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeFormatterHelper.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeFormatterHelper.scala
index eeb56aa..8e5c865 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeFormatterHelper.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeFormatterHelper.scala
@@ -97,7 +97,7 @@ trait DateTimeFormatterHelper {
   protected def getOrCreateFormatter(
       pattern: String,
       locale: Locale,
-      isParsing: Boolean = false): DateTimeFormatter = {
+      isParsing: Boolean): DateTimeFormatter = {
     val newPattern = convertIncompatiblePattern(pattern, isParsing)
     val useVarLen = isParsing && newPattern.contains('S')
     val key = (newPattern, locale, useVarLen)
@@ -234,22 +234,27 @@ private object DateTimeFormatterHelper {
     val formatter = DateTimeFormatter.ofPattern("LLL qqq", Locale.US)
     formatter.format(LocalDate.of(2000, 1, 1)) == "1 1"
   }
-  final val unsupportedLetters = Set('A', 'c', 'e', 'n', 'N', 'p')
   // SPARK-31892: The week-based date fields are rarely used and really confusing for parsing values
-  // to datetime, especially when they are mixed with other non-week-based ones
+  // to datetime, especially when they are mixed with other non-week-based ones;
+  // SPARK-31879: It's also difficult for us to restore the behavior of week-based date fields
+  // formatting, in DateTimeFormatter the first day of week for week-based date fields become
+  // localized, for the default Locale.US, it uses Sunday as the first day of week, while in Spark
+  // 2.4, the SimpleDateFormat uses Monday as the first day of week.
+  final val weekBasedLetters = Set('Y', 'W', 'w', 'u', 'e', 'c')
+  final val unsupportedLetters = Set('A', 'n', 'N', 'p')
   // The quarter fields will also be parsed strangely, e.g. when the pattern contains `yMd` and can
   // be directly resolved then the `q` do check for whether the month is valid, but if the date
   // fields is incomplete, e.g. `yM`, the checking will be bypassed.
-  final val unsupportedLettersForParsing = Set('Y', 'W', 'w', 'E', 'u', 'F', 'q', 'Q')
+  final val unsupportedLettersForParsing = Set('E', 'F', 'q', 'Q')
   final val unsupportedPatternLengths = {
     // SPARK-31771: Disable Narrow-form TextStyle to avoid silent data change, as it is Full-form in
     // 2.4
-    Seq("G", "M", "L", "E", "u", "Q", "q").map(_ * 5) ++
+    Seq("G", "M", "L", "E", "Q", "q").map(_ * 5) ++
      // SPARK-31867: Disable year pattern longer than 10 which will cause Java time library throw
      // unchecked `ArrayIndexOutOfBoundsException` by the `NumberPrinterParser` for formatting. It
      // makes the call side difficult to handle exceptions and easily leads to silent data change
      // because of the exceptions being suppressed.
-      Seq("y", "Y").map(_ * 11)
+      Seq("y").map(_ * 11)
   }.toSet
 
   /**
@@ -260,7 +265,7 @@ private object DateTimeFormatterHelper {
    * @param pattern The input pattern.
    * @return The pattern for new parser
    */
-  def convertIncompatiblePattern(pattern: String, isParsing: Boolean = false): String = {
+  def convertIncompatiblePattern(pattern: String, isParsing: Boolean): String = {
     val eraDesignatorContained = pattern.split("'").zipWithIndex.exists {
       case (patternPart, index) =>
         // Text can be quoted using single quotes, we only check the non-quote parts.
@@ -269,6 +274,10 @@ private object DateTimeFormatterHelper {
     (pattern + " ").split("'").zipWithIndex.map {
       case (patternPart, index) =>
         if (index % 2 == 0) {
+          for (c <- patternPart if weekBasedLetters.contains(c)) {
+            throw new IllegalArgumentException(s"All week-based patterns are unsupported since" +
+              s" Spark 3.0, detected: $c, Please use the SQL function EXTRACT instead")
+          }
           for (c <- patternPart if unsupportedLetters.contains(c) ||
             (isParsing && unsupportedLettersForParsing.contains(c))) {
             throw new IllegalArgumentException(s"Illegal pattern character: $c")
@@ -282,20 +291,13 @@
               "or upgrade your Java version. For more details, please read " +
               "https://bugs.openjdk.java.net/browse/JDK-8114833")
           }
-          // The meaning of 'u' was day number of week in SimpleDateFormat, it was changed to year
-          // in DateTimeFormatter. Substitute 'u' to 'e' and use DateTimeFormatter to parse the
-          // string. If parsable, return the result; otherwise, fall back to 'u', and then use the
-          // legacy SimpleDateFormat parser to parse. When it is successfully parsed, throw an
-          // exception and ask users to change the pattern strings or turn on the legacy mode;
-          // otherwise, return NULL as what Spark 2.4 does.
-          val res = patternPart.replace("u", "e")
           // In DateTimeFormatter, 'u' supports negative years. We substitute 'y' to 'u' here for
           // keeping the support in Spark 3.0. If parse failed in Spark 3.0, fall back to 'y'.
           // We only do this substitution when there is no era designator found in the pattern.
           if (!eraDesignatorContained) {
-            res.replace("y", "u")
+            patternPart.replace("y", "u")
           } else {
-            res
+            patternPart
           }
         } else {
           patternPart
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/TimestampFormatter.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/TimestampFormatter.scala
index e8866d7..6bcbb09 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/TimestampFormatter.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/TimestampFormatter.scala
@@ -62,11 +62,11 @@ class Iso8601TimestampFormatter(
     zoneId: ZoneId,
     locale: Locale,
     legacyFormat: LegacyDateFormat = LENIENT_SIMPLE_DATE_FORMAT,
-    needVarLengthSecondFraction: Boolean)
+    isParsing: Boolean)
   extends TimestampFormatter with DateTimeFormatterHelper {
   @transient
   protected lazy val formatter: DateTimeFormatter =
-    getOrCreateFormatter(pattern, locale, needVarLengthSecondFraction)
+    getOrCreateFormatter(pattern, locale, isParsing)
 
   @transient
   protected lazy val legacyFormatter = TimestampFormatter.getLegacyFormatter(
@@ -122,7 +122,7 @@ class FractionTimestampFormatter(zoneId: ZoneId)
     zoneId,
     TimestampFormatter.defaultLocale,
     LegacyDateFormats.FAST_DATE_FORMAT,
-    needVarLengthSecondFraction = false) {
+    isParsing = false) {
 
   @transient
   override protected lazy val formatter = DateTimeFormatterHelper.fractionFormatter
@@ -266,7 +266,7 @@ object TimestampFormatter {
       zoneId: ZoneId,
       locale: Locale = defaultLocale,
       legacyFormat: LegacyDateFormat = LENIENT_SIMPLE_DATE_FORMAT,
-      isParsing: Boolean = false): TimestampFormatter = {
+      isParsing: Boolean): TimestampFormatter = {
     val pattern = format.getOrElse(defaultPattern)
     if (SQLConf.get.legacyTimeParserPolicy == LEGACY) {
       getLegacyFormatter(pattern, zoneId, locale, legacyFormat)
@@ -313,12 +313,12 @@ object TimestampFormatter {
   def apply(
       format: String,
       zoneId: ZoneId,
-      isParsing: Boolean = false): TimestampFormatter = {
+      isParsing: Boolean): TimestampFormatter = {
     getFormatter(Some(format), zoneId, isParsing = isParsing)
   }
 
   def apply(zoneId: ZoneId): TimestampFormatter = {
-    getFormatter(None, zoneId)
+    getFormatter(None, zoneId, isParsing = false)
   }
 
   def getFractionFormatter(zoneId: ZoneId): TimestampFormatter = {
diff --git a/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/DateExpressionsSuite.scala b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/DateExpressionsSuite.scala
index dcb4759..df41754 100644
--- a/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/DateExpressionsSuite.scala
+++ b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/DateExpressionsSuite.scala
@@ -41,7 +41,7 @@ class DateExpressionsSuite extends SparkFunSuite with ExpressionEvalHelper {
   private val JST_OPT = Option(JST.getId)
 
   def toMillis(timestamp: String): Long = {
-    val tf = TimestampFormatter("yyyy-MM-dd HH:mm:ss", UTC)
+    val tf = TimestampFormatter("yyyy-MM-dd HH:mm:ss", UTC, isParsing = true)
     DateTimeUtils.toMillis(tf.parse(timestamp))
   }
   val date = "2015-04-08 13:10:15"
diff --git a/sql/catalyst/src/test/scala/org/apache/spark/sql/util/DateFormatterSuite.scala b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/DateFormatterSuite.scala
similarity index 96%
rename from sql/catalyst/src/test/scala/org/apache/spark/sql/util/DateFormatterSuite.scala
rename to sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/DateFormatterSuite.scala
index 22a1396..4892dea 100644
--- a/sql/catalyst/src/test/scala/org/apache/spark/sql/util/DateFormatterSuite.scala
+++ b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/DateFormatterSuite.scala
@@ -15,19 +15,22 @@
  * limitations under the License.
  */
 
-package org.apache.spark.sql.util
+package org.apache.spark.sql.catalyst.util
 
 import java.time.{DateTimeException, LocalDate}
 
-import org.apache.spark.{SparkFunSuite, SparkUpgradeException}
-import org.apache.spark.sql.catalyst.plans.SQLHelper
-import org.apache.spark.sql.catalyst.util.{DateFormatter, LegacyDateFormats}
+import org.apache.spark.SparkUpgradeException
 import org.apache.spark.sql.catalyst.util.DateTimeTestUtils._
 import org.apache.spark.sql.catalyst.util.DateTimeUtils._
 import org.apache.spark.sql.internal.SQLConf
 import org.apache.spark.sql.internal.SQLConf.LegacyBehaviorPolicy
 
-class DateFormatterSuite extends SparkFunSuite with SQLHelper {
+class DateFormatterSuite extends DatetimeFormatterSuite {
+
+  override def checkFormatterCreation(pattern: String, isParsing: Boolean): Unit = {
+    DateFormatter(pattern, UTC, isParsing)
+  }
+
   test("parsing dates") {
     outstandingTimezonesIds.foreach { timeZone =>
       withSQLConf(SQLConf.SESSION_LOCAL_TIMEZONE.key -> timeZone) {
diff --git a/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/DateTimeFormatterHelperSuite.scala b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/DateTimeFormatterHelperSuite.scala
index c68bdac..0b15e49 100644
--- a/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/DateTimeFormatterHelperSuite.scala
+++ b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/DateTimeFormatterHelperSuite.scala
@@ -22,27 +22,32 @@ import org.apache.spark.sql.catalyst.util.DateTimeFormatterHelper._
 
 class DateTimeFormatterHelperSuite extends SparkFunSuite {
 
+  private def convertIncompatiblePattern(pattern: String): String = {
+    DateTimeFormatterHelper.convertIncompatiblePattern(pattern, isParsing = false)
+  }
+
   test("check incompatible pattern") {
-    assert(convertIncompatiblePattern("MM-DD-u") === "MM-DD-e")
     assert(convertIncompatiblePattern("yyyy-MM-dd'T'HH:mm:ss.SSSz")
       === "uuuu-MM-dd'T'HH:mm:ss.SSSz")
     assert(convertIncompatiblePattern("yyyy-MM'y contains in quoted text'HH:mm:ss")
       === "uuuu-MM'y contains in quoted text'HH:mm:ss")
-    assert(convertIncompatiblePattern("yyyy-MM-dd-u'T'HH:mm:ss.SSSz")
-      === "uuuu-MM-dd-e'T'HH:mm:ss.SSSz")
     assert(convertIncompatiblePattern("yyyy-MM'u contains in quoted text'HH:mm:ss")
       === "uuuu-MM'u contains in quoted text'HH:mm:ss")
     assert(convertIncompatiblePattern("yyyy-MM'u contains in quoted text'''''HH:mm:ss")
       === "uuuu-MM'u contains in quoted text'''''HH:mm:ss")
     assert(convertIncompatiblePattern("yyyy-MM-dd'T'HH:mm:ss.SSSz G")
       === "yyyy-MM-dd'T'HH:mm:ss.SSSz G")
+    weekBasedLetters.foreach { l =>
+      val e = intercept[IllegalArgumentException](convertIncompatiblePattern(s"yyyy-MM-dd $l G"))
+      assert(e.getMessage.contains("week-based"))
+    }
     unsupportedLetters.foreach { l =>
       val e = intercept[IllegalArgumentException](convertIncompatiblePattern(s"yyyy-MM-dd $l G"))
       assert(e.getMessage === s"Illegal pattern character: $l")
     }
     unsupportedLettersForParsing.foreach { l =>
       val e = intercept[IllegalArgumentException] {
-        convertIncompatiblePattern(s"$l", isParsing = true)
+        DateTimeFormatterHelper.convertIncompatiblePattern(s"$l", isParsing = true)
       }
       assert(e.getMessage === s"Illegal pattern character: $l")
       assert(convertIncompatiblePattern(s"$l").nonEmpty)
@@ -57,7 +62,6 @@ class DateTimeFormatterHelperSuite extends SparkFunSuite {
       }
       assert(e2.getMessage === s"Too many pattern letters: ${style.head}")
     }
-    assert(convertIncompatiblePattern("yyyy-MM-dd uuuu") === "uuuu-MM-dd eeee")
     assert(convertIncompatiblePattern("yyyy-MM-dd EEEE") === "uuuu-MM-dd EEEE")
     assert(convertIncompatiblePattern("yyyy-MM-dd'e'HH:mm:ss") === "uuuu-MM-dd'e'HH:mm:ss")
     assert(convertIncompatiblePattern("yyyy-MM-dd'T'") === "uuuu-MM-dd'T'")
diff --git a/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/DatetimeFormatterSuite.scala b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/DatetimeFormatterSuite.scala
new file mode 100644
index 0000000..31ff50f
--- /dev/null
+++ b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/DatetimeFormatterSuite.scala
@@ -0,0 +1,54 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.util
+
+import org.scalatest.Matchers
+
+import org.apache.spark.{SparkFunSuite, SparkUpgradeException}
+import org.apache.spark.sql.catalyst.plans.SQLHelper
+
+trait DatetimeFormatterSuite extends SparkFunSuite with SQLHelper with Matchers {
+  import DateTimeFormatterHelper._
+  def checkFormatterCreation(pattern: String, isParsing: Boolean): Unit
+
+  test("explicitly forbidden datetime patterns") {
+
+    Seq(true, false).foreach { isParsing =>
+      // not support by the legacy one too
+      val unsupportedBoth = Seq("QQQQQ", "qqqqq", "eeeee", "A", "c", "n", "N", "p", "e")
+      unsupportedBoth.foreach { pattern =>
+        intercept[IllegalArgumentException](checkFormatterCreation(pattern, isParsing))
+      }
+      // supported by the legacy one, then we will suggest users with SparkUpgradeException
+      ((weekBasedLetters ++ unsupportedLetters).map(_.toString)
+        ++ unsupportedPatternLengths -- unsupportedBoth).foreach {
+        pattern => intercept[SparkUpgradeException](checkFormatterCreation(pattern, isParsing))
+      }
+    }
+
+    // not support by the legacy one too
+    val unsupportedBoth = Seq("q", "Q")
+    unsupportedBoth.foreach { pattern =>
+      intercept[IllegalArgumentException](checkFormatterCreation(pattern, true))
+    }
+    // supported by the legacy one, then we will suggest users with SparkUpgradeException
+    (unsupportedLettersForParsing.map(_.toString) -- unsupportedBoth).foreach {
+      pattern => intercept[SparkUpgradeException](checkFormatterCreation(pattern, true))
+    }
+  }
+}
diff --git a/sql/catalyst/src/test/scala/org/apache/spark/sql/util/TimestampFormatterSuite.scala b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/TimestampFormatterSuite.scala
similarity index 90%
rename from sql/catalyst/src/test/scala/org/apache/spark/sql/util/TimestampFormatterSuite.scala
rename to sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/TimestampFormatterSuite.scala
index efd97a4..311097f 100644
--- a/sql/catalyst/src/test/scala/org/apache/spark/sql/util/TimestampFormatterSuite.scala
+++ b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/TimestampFormatterSuite.scala
@@ -15,23 +15,23 @@
  * limitations under the License.
  */
 
-package org.apache.spark.sql.util
+package org.apache.spark.sql.catalyst.util
 
 import java.time.{DateTimeException, Instant, LocalDateTime, LocalTime}
 import java.util.concurrent.TimeUnit
 
-import org.scalatest.Matchers
-
-import org.apache.spark.{SparkFunSuite, SparkUpgradeException}
-import org.apache.spark.sql.catalyst.plans.SQLHelper
-import org.apache.spark.sql.catalyst.util.{LegacyDateFormats, TimestampFormatter}
+import org.apache.spark.SparkUpgradeException
 import org.apache.spark.sql.catalyst.util.DateTimeTestUtils._
 import org.apache.spark.sql.catalyst.util.DateTimeUtils._
 import org.apache.spark.sql.internal.SQLConf
 import org.apache.spark.sql.internal.SQLConf.LegacyBehaviorPolicy
 import org.apache.spark.unsafe.types.UTF8String
 
-class TimestampFormatterSuite extends SparkFunSuite with SQLHelper with Matchers {
+class TimestampFormatterSuite extends DatetimeFormatterSuite {
+
+  override def checkFormatterCreation(pattern: String, isParsing: Boolean): Unit = {
+    TimestampFormatter(pattern, UTC, isParsing)
+  }
 
   test("parsing timestamps using time zones") {
     val localDate = "2018-12-02T10:11:12.001234"
@@ -96,7 +96,7 @@ class TimestampFormatterSuite extends SparkFunSuite with SQLHelper with Matchers
         2177456523456789L,
         11858049903010203L).foreach { micros =>
         outstandingZoneIds.foreach { zoneId =>
-          val timestamp = TimestampFormatter(pattern, zoneId).format(micros)
+          val timestamp = TimestampFormatter(pattern, zoneId, isParsing = false).format(micros)
           val parsed = TimestampFormatter(
             pattern, zoneId, isParsing = true).parse(timestamp)
           assert(micros === parsed)
@@ -120,14 +120,14 @@ class TimestampFormatterSuite extends SparkFunSuite with SQLHelper with Matchers
         val pattern = "yyyy-MM-dd'T'HH:mm:ss.SSSSSS"
         val micros = TimestampFormatter(
           pattern, zoneId, isParsing = true).parse(timestamp)
-        val formatted = TimestampFormatter(pattern, zoneId).format(micros)
+        val formatted = TimestampFormatter(pattern, zoneId, isParsing = false).format(micros)
         assert(timestamp === formatted)
       }
     }
   }
 
   test("case insensitive parsing of am and pm") {
-    val formatter = TimestampFormatter("yyyy MMM dd hh:mm:ss a", UTC)
+    val formatter = TimestampFormatter("yyyy MMM dd hh:mm:ss a", UTC, isParsing = false)
     val micros = formatter.parse("2009 Mar 20 11:30:01 am")
     assert(micros === date(2009, 3, 20, 11, 30, 1))
   }
@@ -154,8 +154,8 @@ class TimestampFormatterSuite extends SparkFunSuite with SQLHelper with Matchers
     assert(TimestampFormatter(UTC).format(micros) === "-0099-01-01 00:00:00")
     assert(TimestampFormatter(UTC).format(instant) === "-0099-01-01 00:00:00")
     withDefaultTimeZone(UTC) { // toJavaTimestamp depends on the default time zone
-      assert(TimestampFormatter("yyyy-MM-dd HH:mm:SS G", UTC).format(toJavaTimestamp(micros))
-        === "0100-01-01 00:00:00 BC")
+      assert(TimestampFormatter("yyyy-MM-dd HH:mm:SS G", UTC, isParsing = false)
+        .format(toJavaTimestamp(micros)) === "0100-01-01 00:00:00 BC")
     }
   }
 
@@ -206,7 +206,7 @@ class TimestampFormatterSuite extends SparkFunSuite with SQLHelper with Matchers
         "2019-10-14T09:39:07.1", "2019-10-14T09:39:07.1")
 
       try {
-        TimestampFormatter("yyyy/MM/dd HH_mm_ss.SSSSSS", zoneId, true)
+        TimestampFormatter("yyyy/MM/dd HH_mm_ss.SSSSSS", zoneId, isParsing = true)
           .parse("2019/11/14 20#25#30.123456")
         fail("Expected to throw an exception for the invalid input")
       } catch {
@@ -219,7 +219,7 @@ class TimestampFormatterSuite extends SparkFunSuite with SQLHelper with Matchers
   test("formatting timestamp strings up to microsecond precision") {
     outstandingZoneIds.foreach { zoneId =>
       def check(pattern: String, input: String, expected: String): Unit = {
-        val formatter = TimestampFormatter(pattern, zoneId)
+        val formatter = TimestampFormatter(pattern, zoneId, isParsing = false)
         val timestamp = stringToTimestamp(UTF8String.fromString(input), zoneId).get
         val actual = formatter.format(timestamp)
         assert(actual === expected)
@@ -256,7 +256,7 @@ class TimestampFormatterSuite extends SparkFunSuite with SQLHelper with Matchers
   }
 
   test("SPARK-30958: parse timestamp with negative year") {
-    val formatter1 = TimestampFormatter("yyyy-MM-dd HH:mm:ss", UTC, true)
+    val formatter1 = TimestampFormatter("yyyy-MM-dd HH:mm:ss", UTC, isParsing = true)
     assert(formatter1.parse("-1234-02-22 02:22:22") === date(-1234, 2, 22, 2, 22, 22))
 
     def assertParsingError(f: => Unit): Unit = {
@@ -269,7 +269,7 @@ class TimestampFormatterSuite extends SparkFunSuite with SQLHelper with Matchers
     }
 
     // "yyyy" with "G" can't parse negative year or year 0000.
-    val formatter2 = TimestampFormatter("G yyyy-MM-dd HH:mm:ss", UTC, true)
+    val formatter2 = TimestampFormatter("G yyyy-MM-dd HH:mm:ss", UTC, isParsing = true)
     assertParsingError(formatter2.parse("BC -1234-02-22 02:22:22"))
     assertParsingError(formatter2.parse("AC 0000-02-22 02:22:22"))
 
@@ -315,7 +315,7 @@ class TimestampFormatterSuite extends SparkFunSuite with SQLHelper with Matchers
   test("parsing hour with various patterns") {
     def createFormatter(pattern: String): TimestampFormatter = {
       // Use `SIMPLE_DATE_FORMAT`, so that the legacy parser also fails with invalid value range.
-      TimestampFormatter(pattern, UTC, LegacyDateFormats.SIMPLE_DATE_FORMAT, false)
+      TimestampFormatter(pattern, UTC, LegacyDateFormats.SIMPLE_DATE_FORMAT, isParsing = true)
     }
 
     withClue("HH") {
@@ -374,56 +374,45 @@ class TimestampFormatterSuite extends SparkFunSuite with SQLHelper with Matchers
   }
 
   test("missing date fields") {
-    val formatter = TimestampFormatter("HH:mm:ss", UTC)
+    val formatter = TimestampFormatter("HH:mm:ss", UTC, isParsing = true)
     val micros = formatter.parse("11:30:01")
     assert(micros === date(1970, 1, 1, 11, 30, 1))
   }
 
   test("missing year field with invalid date") {
     // Use `SIMPLE_DATE_FORMAT`, so that the legacy parser also fails with invalid date.
-    val formatter = TimestampFormatter("MM-dd", UTC, LegacyDateFormats.SIMPLE_DATE_FORMAT, false)
+    val formatter =
+      TimestampFormatter("MM-dd", UTC, LegacyDateFormats.SIMPLE_DATE_FORMAT, isParsing = true)
     withDefaultTimeZone(UTC)(intercept[DateTimeException](formatter.parse("02-29")))
   }
 
   test("missing am/pm field") {
     Seq("HH", "hh", "KK", "kk").foreach { hour =>
-      val formatter = TimestampFormatter(s"yyyy $hour:mm:ss", UTC)
+      val formatter = TimestampFormatter(s"yyyy $hour:mm:ss", UTC, isParsing = true)
       val micros = formatter.parse("2009 11:30:01")
       assert(micros === date(2009, 1, 1, 11, 30, 1))
     }
   }
 
   test("missing time fields") {
-    val formatter = TimestampFormatter("yyyy HH", UTC)
+    val formatter = TimestampFormatter("yyyy HH", UTC, isParsing = true)
     val micros = formatter.parse("2009 11")
     assert(micros === date(2009, 1, 1, 11))
   }
 
   test("missing hour field") {
-    val f1 = TimestampFormatter("mm:ss a", UTC)
+    val f1 = TimestampFormatter("mm:ss a", UTC, isParsing = true)
     val t1 = f1.parse("30:01 PM")
     assert(t1 === date(1970, 1, 1, 12, 30, 1))
     val t2 = f1.parse("30:01 AM")
     assert(t2 === date(1970, 1, 1, 0, 30, 1))
-    val f2 = TimestampFormatter("mm:ss", UTC)
+    val f2 = TimestampFormatter("mm:ss", UTC, isParsing = true)
     val t3 = f2.parse("30:01")
     assert(t3 === date(1970, 1, 1, 0, 30, 1))
-    val f3 = TimestampFormatter("a", UTC)
+    val f3 = TimestampFormatter("a", UTC, isParsing = true)
     val t4 = f3.parse("PM")
     assert(t4 === date(1970, 1, 1, 12))
     val t5 = f3.parse("AM")
     assert(t5 === date(1970))
   }
-
-  test("explicitly forbidden datetime patterns") {
-    // not support by the legacy one too
-    Seq("QQQQQ", "qqqqq", "A", "c", "e", "n", "N", "p").foreach { pattern =>
-      intercept[IllegalArgumentException](TimestampFormatter(pattern, UTC).format(0))
-    }
-    // supported by the legacy one, then we will suggest users with SparkUpgradeException
-    Seq("GGGGG", "MMMMM", "LLLLL", "EEEEE", "uuuuu", "aa", "aaa", "y" * 11, "y" * 11)
-      .foreach { pattern =>
-        intercept[SparkUpgradeException](TimestampFormatter(pattern, UTC).format(0))
-    }
-  }
 }
diff --git a/sql/core/src/test/resources/sql-tests/inputs/datetime-formatting-invalid.sql b/sql/core/src/test/resources/sql-tests/inputs/datetime-formatting-invalid.sql
new file mode 100644
index 0000000..9072aa1
--- /dev/null
+++ b/sql/core/src/test/resources/sql-tests/inputs/datetime-formatting-invalid.sql
@@ -0,0 +1,53 @@
+--- TESTS FOR DATETIME FORMATTING FUNCTIONS WITH INVALID PATTERNS ---
+
+-- separating this from datetime-formatting.sql, because the text form
+-- for patterns with 5 letters in SimpleDateFormat varies from different JDKs
+select date_format('2018-11-17 13:33:33.333', 'GGGGG');
+-- pattern letter count can not be greater than 10
+select date_format('2018-11-17 13:33:33.333', 'yyyyyyyyyyy');
+-- q/L in JDK 8 will fail when the count is more than 2
+select date_format('2018-11-17 13:33:33.333', 'qqqqq');
+select date_format('2018-11-17 13:33:33.333', 'QQQQQ');
+select date_format('2018-11-17 13:33:33.333', 'MMMMM');
+select date_format('2018-11-17 13:33:33.333', 'LLLLL');
+
+select date_format('2018-11-17 13:33:33.333', 'EEEEE');
+select date_format('2018-11-17 13:33:33.333', 'FF');
+select date_format('2018-11-17 13:33:33.333', 'ddd');
+-- DD is invalid if the day-of-year exceeds 100, but it becomes valid in Java 11
+-- select date_format('2018-11-17 13:33:33.333', 'DD');
+select date_format('2018-11-17 13:33:33.333', 'DDDD');
+select date_format('2018-11-17 13:33:33.333', 'HHH');
+select date_format('2018-11-17 13:33:33.333', 'hhh');
+select date_format('2018-11-17 13:33:33.333', 'kkk');
+select date_format('2018-11-17 13:33:33.333', 'KKK');
+select date_format('2018-11-17 13:33:33.333', 'mmm');
+select date_format('2018-11-17 13:33:33.333', 'sss');
+select date_format('2018-11-17 13:33:33.333', 'SSSSSSSSSS');
+select date_format('2018-11-17 13:33:33.333', 'aa');
+select date_format('2018-11-17 13:33:33.333', 'V');
+select date_format('2018-11-17 13:33:33.333', 'zzzzz');
+select date_format('2018-11-17 13:33:33.333', 'XXXXXX');
+select date_format('2018-11-17 13:33:33.333', 'ZZZZZZ');
+select date_format('2018-11-17 13:33:33.333', 'OO');
+select date_format('2018-11-17 13:33:33.333', 'xxxxxx');
+
+select date_format('2018-11-17 13:33:33.333', 'A');
+select date_format('2018-11-17 13:33:33.333', 'n');
+select date_format('2018-11-17 13:33:33.333', 'N');
+select date_format('2018-11-17 13:33:33.333', 'p');
+
+-- disabled week-based patterns
+select date_format('2018-11-17 13:33:33.333', 'Y');
+select date_format('2018-11-17 13:33:33.333', 'w');
+select date_format('2018-11-17 13:33:33.333', 'W');
+select date_format('2018-11-17 13:33:33.333', 'u');
+select date_format('2018-11-17 13:33:33.333', 'e');
+select date_format('2018-11-17 13:33:33.333', 'c');
+
+-- others
+select date_format('2018-11-17 13:33:33.333', 'B');
+select date_format('2018-11-17 13:33:33.333', 'C');
+select date_format('2018-11-17 13:33:33.333', 'I');
+
+
diff --git a/sql/core/src/test/resources/sql-tests/inputs/datetime-formatting-legacy.sql b/sql/core/src/test/resources/sql-tests/inputs/datetime-formatting-legacy.sql
new file mode 100644
index 0000000..19cab61
--- /dev/null
+++ b/sql/core/src/test/resources/sql-tests/inputs/datetime-formatting-legacy.sql
@@ -0,0 +1,2 @@
+--SET spark.sql.legacy.timeParserPolicy=LEGACY
+--IMPORT datetime-formatting.sql
\ No newline at end of file
diff --git a/sql/core/src/test/resources/sql-tests/inputs/datetime-formatting.sql b/sql/core/src/test/resources/sql-tests/inputs/datetime-formatting.sql
new file mode 100644
index 0000000..3b23a77
--- /dev/null
+++ b/sql/core/src/test/resources/sql-tests/inputs/datetime-formatting.sql
@@ -0,0 +1,68 @@
+--- TESTS FOR DATETIME FORMATTING FUNCTIONS ---
+
+create temporary view v as select col from values
+ (timestamp '1582-06-01 11:33:33.123UTC+080000'),
+ (timestamp '1970-01-01 00:00:00.000Europe/Paris'),
+ (timestamp '1970-12-31 23:59:59.999Asia/Srednekolymsk'),
+ (timestamp '1996-04-01 00:33:33.123Australia/Darwin'),
+ (timestamp '2018-11-17 13:33:33.123Z'),
+ (timestamp '2020-01-01 01:33:33.123Asia/Shanghai'),
+ (timestamp '2100-01-01 01:33:33.123America/Los_Angeles') t(col);
+
+select col, date_format(col, 'G GG GGG GGGG') from v;
+
+select col, date_format(col, 'y yy yyy yyyy yyyyy yyyyyy yyyyyyy yyyyyyyy yyyyyyyyy yyyyyyyyyy') from v;
+
+select col, date_format(col, 'q qq') from v;
+
+select col, date_format(col, 'Q QQ QQQ QQQQ') from v;
+
+select col, date_format(col, 'M MM MMM MMMM') from v;
+
+select col, date_format(col, 'L LL') from v;
+
+select col, date_format(col, 'E EE EEE EEEE') from v;
+
+select col, date_format(col, 'F') from v;
+
+select col, date_format(col, 'd dd') from v;
+
+select col, date_format(col, 'DD') from v where col = timestamp '2100-01-01 01:33:33.123America/Los_Angeles';
+select col, date_format(col, 'D DDD') from v;
+
+select col, date_format(col, 'H HH') from v;
+
+select col, date_format(col, 'h hh') from v;
+
+select col, date_format(col, 'k kk') from v;
+
+select col, date_format(col, 'K KK') from v;
+
+select col, date_format(col, 'm mm') from v;
+
+select col, date_format(col, 's ss') from v;
+
+select col, date_format(col, 'S SS SSS SSSS SSSSS SSSSSS SSSSSSS SSSSSSSS SSSSSSSSS') from v;
+
+select col, date_format(col, 'a') from v;
+
+select col, date_format(col, 'VV') from v;
+
+select col, date_format(col, 'z zz zzz zzzz') from v;
+
+select col, date_format(col, 'X XX XXX') from v;
+select col, date_format(col, 'XXXX XXXXX') from v;
+
+select col, date_format(col, 'Z ZZ ZZZ ZZZZ ZZZZZ') from v;
+
+select col, date_format(col, 'O OOOO') from v;
+
+select col, date_format(col, 'x xx xxx xxxx xxxx xxxxx') from v;
+
+-- optional pattern, but the results won't be optional for formatting
+select col, date_format(col, '[yyyy-MM-dd HH:mm:ss]') from v;
+
+-- literals
+select col, date_format(col, "姚123'GyYqQMLwWuEFDdhHmsSaVzZxXOV'") from v;
+select col, date_format(col, "''") from v;
+select col, date_format(col, '') from v;
diff --git a/sql/core/src/test/resources/sql-tests/inputs/datetime.sql b/sql/core/src/test/resources/sql-tests/inputs/datetime.sql
index 4eefa0f..e4a2e5d 100644
--- a/sql/core/src/test/resources/sql-tests/inputs/datetime.sql
+++ b/sql/core/src/test/resources/sql-tests/inputs/datetime.sql
@@ -114,10 +114,6 @@ select to_timestamp("12.1234019-10-06S10:11", "ss.SSSSy-MM-dd'S'HH:mm");
 select to_timestamp("2019-10-06S", "yyyy-MM-dd'S'");
 select to_timestamp("S2019-10-06", "'S'yyyy-MM-dd");
 
-select date_format(timestamp '2019-10-06', 'yyyy-MM-dd uuee');
-select date_format(timestamp '2019-10-06', 'yyyy-MM-dd uucc');
-select date_format(timestamp '2019-10-06', 'yyyy-MM-dd uuuu');
-
 select to_timestamp("2019-10-06T10:11:12'12", "yyyy-MM-dd'T'HH:mm:ss''SSSS"); -- middle
 select to_timestamp("2019-10-06T10:11:12'", "yyyy-MM-dd'T'HH:mm:ss''"); -- tail
 select to_timestamp("'2019-10-06T10:11:12", "''yyyy-MM-dd'T'HH:mm:ss"); -- head
diff --git a/sql/core/src/test/resources/sql-tests/results/ansi/datetime.sql.out b/sql/core/src/test/resources/sql-tests/results/ansi/datetime.sql.out
index 43fe0a6..1cb1aaa 100644
--- a/sql/core/src/test/resources/sql-tests/results/ansi/datetime.sql.out
+++ b/sql/core/src/test/resources/sql-tests/results/ansi/datetime.sql.out
@@ -1,5 +1,5 @@
 -- Automatically generated by SQLQueryTestSuite
--- Number of queries: 112
+-- Number of queries: 109
 
 
 -- !query
@@ -675,32 +675,6 @@ struct<to_timestamp('S2019-10-06', '\'S\'yyyy-MM-dd'):timestamp>
 
 
 -- !query
-select date_format(timestamp '2019-10-06', 'yyyy-MM-dd uuee')
--- !query schema
-struct<>
--- !query output
-java.lang.IllegalArgumentException
-Illegal pattern character: e
-
-
--- !query
-select date_format(timestamp '2019-10-06', 'yyyy-MM-dd uucc')
--- !query schema
-struct<>
--- !query output
-java.lang.IllegalArgumentException
-Illegal pattern character: c
-
-
--- !query
-select date_format(timestamp '2019-10-06', 'yyyy-MM-dd uuuu')
--- !query schema
-struct<date_format(TIMESTAMP '2019-10-06 00:00:00', yyyy-MM-dd uuuu):string>
--- !query output
-2019-10-06 Sunday
-
-
--- !query
 select to_timestamp("2019-10-06T10:11:12'12", "yyyy-MM-dd'T'HH:mm:ss''SSSS")
 -- !query schema
 struct<to_timestamp('2019-10-06T10:11:12\'12', 'yyyy-MM-dd\'T\'HH:mm:ss\'\'SSSS'):timestamp>
diff --git a/sql/core/src/test/resources/sql-tests/results/datetime-formatting-invalid.sql.out b/sql/core/src/test/resources/sql-tests/results/datetime-formatting-invalid.sql.out
new file mode 100644
index 0000000..248157e
--- /dev/null
+++ b/sql/core/src/test/resources/sql-tests/results/datetime-formatting-invalid.sql.out
@@ -0,0 +1,335 @@
+-- Automatically generated by SQLQueryTestSuite
+-- Number of queries: 37
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'GGGGG')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'GGGGG' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'yyyyyyyyyyy')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'yyyyyyyyyyy' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'qqqqq')
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+Too many pattern letters: q
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'QQQQQ')
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+Too many pattern letters: Q
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'MMMMM')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'MMMMM' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'LLLLL')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'LLLLL' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'EEEEE')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'EEEEE' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'FF')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'FF' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'ddd')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'ddd' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'DDDD')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'DDDD' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'HHH')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'HHH' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'hhh')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'hhh' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'kkk')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'kkk' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'KKK')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'KKK' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'mmm')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'mmm' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'sss')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'sss' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'SSSSSSSSSS')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'SSSSSSSSSS' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'aa')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'aa' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'V')
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+Pattern letter count must be 2: V
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'zzzzz')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'zzzzz' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'XXXXXX')
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+Too many pattern letters: X
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'ZZZZZZ')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'ZZZZZZ' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'OO')
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+Pattern letter count must be 1 or 4: O
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'xxxxxx')
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+Too many pattern letters: x
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'A')
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+Illegal pattern character: A
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'n')
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+Illegal pattern character: n
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'N')
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+Illegal pattern character: N
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'p')
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+Illegal pattern character: p
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'Y')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'Y' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'w')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'w' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'W')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'W' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'u')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'u' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'e')
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+All week-based patterns are unsupported since Spark 3.0, detected: e, Please use the SQL function EXTRACT instead
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'c')
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+All week-based patterns are unsupported since Spark 3.0, detected: c, Please use the SQL function EXTRACT instead
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'B')
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+Unknown pattern letter: B
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'C')
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+Unknown pattern letter: C
+
+
+-- !query
+select date_format('2018-11-17 13:33:33.333', 'I')
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+Unknown pattern letter: I
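
For reference, the replacement these errors point to is the SQL EXTRACT function. A minimal sketch of that usage follows; the field names come from Spark 3.0's EXTRACT grammar, and the outputs are what I would expect for this timestamp (2018-11-17 is a Saturday), not values captured in the generated file:

```sql
-- hypothetical session, not part of the golden file
spark-sql> SELECT extract(DAYOFWEEK FROM timestamp '2018-11-17 13:33:33.333');
7
spark-sql> SELECT extract(WEEK FROM timestamp '2018-11-17 13:33:33.333');
46
```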
diff --git a/sql/core/src/test/resources/sql-tests/results/datetime-formatting-legacy.sql.out b/sql/core/src/test/resources/sql-tests/results/datetime-formatting-legacy.sql.out
new file mode 100644
index 0000000..b7bc448
--- /dev/null
+++ b/sql/core/src/test/resources/sql-tests/results/datetime-formatting-legacy.sql.out
@@ -0,0 +1,401 @@
+-- Automatically generated by SQLQueryTestSuite
+-- Number of queries: 31
+
+
+-- !query
+create temporary view v as select col from values
+ (timestamp '1582-06-01 11:33:33.123UTC+080000'),
+ (timestamp '1970-01-01 00:00:00.000Europe/Paris'),
+ (timestamp '1970-12-31 23:59:59.999Asia/Srednekolymsk'),
+ (timestamp '1996-04-01 00:33:33.123Australia/Darwin'),
+ (timestamp '2018-11-17 13:33:33.123Z'),
+ (timestamp '2020-01-01 01:33:33.123Asia/Shanghai'),
+ (timestamp '2100-01-01 01:33:33.123America/Los_Angeles') t(col)
+-- !query schema
+struct<>
+-- !query output
+
+
+
+-- !query
+select col, date_format(col, 'G GG GGG GGGG') from v
+-- !query schema
+struct<col:timestamp,date_format(col, G GG GGG GGGG):string>
+-- !query output
+1582-05-31 19:40:35.123        AD AD AD AD
+1969-12-31 15:00:00    AD AD AD AD
+1970-12-31 04:59:59.999        AD AD AD AD
+1996-03-31 07:03:33.123        AD AD AD AD
+2018-11-17 05:33:33.123        AD AD AD AD
+2019-12-31 09:33:33.123        AD AD AD AD
+2100-01-01 01:33:33.123        AD AD AD AD
+
+
+-- !query
+select col, date_format(col, 'y yy yyy yyyy yyyyy yyyyyy yyyyyyy yyyyyyyy yyyyyyyyy yyyyyyyyyy') from v
+-- !query schema
+struct<col:timestamp,date_format(col, y yy yyy yyyy yyyyy yyyyyy yyyyyyy yyyyyyyy yyyyyyyyy yyyyyyyyyy):string>
+-- !query output
+1582-05-31 19:40:35.123	1582 82 1582 1582 01582 001582 0001582 00001582 000001582 0000001582
+1969-12-31 15:00:00	1969 69 1969 1969 01969 001969 0001969 00001969 000001969 0000001969
+1970-12-31 04:59:59.999	1970 70 1970 1970 01970 001970 0001970 00001970 000001970 0000001970
+1996-03-31 07:03:33.123	1996 96 1996 1996 01996 001996 0001996 00001996 000001996 0000001996
+2018-11-17 05:33:33.123	2018 18 2018 2018 02018 002018 0002018 00002018 000002018 0000002018
+2019-12-31 09:33:33.123	2019 19 2019 2019 02019 002019 0002019 00002019 000002019 0000002019
+2100-01-01 01:33:33.123	2100 00 2100 2100 02100 002100 0002100 00002100 000002100 0000002100
+
+
+-- !query
+select col, date_format(col, 'q qq') from v
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+Illegal pattern character 'q'
+
+
+-- !query
+select col, date_format(col, 'Q QQ QQQ QQQQ') from v
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+Illegal pattern character 'Q'
+
+
+-- !query
+select col, date_format(col, 'M MM MMM MMMM') from v
+-- !query schema
+struct<col:timestamp,date_format(col, M MM MMM MMMM):string>
+-- !query output
+1582-05-31 19:40:35.123        5 05 May May
+1969-12-31 15:00:00    12 12 Dec December
+1970-12-31 04:59:59.999        12 12 Dec December
+1996-03-31 07:03:33.123        3 03 Mar March
+2018-11-17 05:33:33.123        11 11 Nov November
+2019-12-31 09:33:33.123        12 12 Dec December
+2100-01-01 01:33:33.123        1 01 Jan January
+
+
+-- !query
+select col, date_format(col, 'L LL') from v
+-- !query schema
+struct<col:timestamp,date_format(col, L LL):string>
+-- !query output
+1582-05-31 19:40:35.123        5 05
+1969-12-31 15:00:00    12 12
+1970-12-31 04:59:59.999        12 12
+1996-03-31 07:03:33.123        3 03
+2018-11-17 05:33:33.123        11 11
+2019-12-31 09:33:33.123        12 12
+2100-01-01 01:33:33.123        1 01
+
+
+-- !query
+select col, date_format(col, 'E EE EEE EEEE') from v
+-- !query schema
+struct<col:timestamp,date_format(col, E EE EEE EEEE):string>
+-- !query output
+1582-05-31 19:40:35.123        Thu Thu Thu Thursday
+1969-12-31 15:00:00    Wed Wed Wed Wednesday
+1970-12-31 04:59:59.999        Thu Thu Thu Thursday
+1996-03-31 07:03:33.123        Sun Sun Sun Sunday
+2018-11-17 05:33:33.123        Sat Sat Sat Saturday
+2019-12-31 09:33:33.123        Tue Tue Tue Tuesday
+2100-01-01 01:33:33.123        Fri Fri Fri Friday
+
+
+-- !query
+select col, date_format(col, 'F') from v
+-- !query schema
+struct<col:timestamp,date_format(col, F):string>
+-- !query output
+1582-05-31 19:40:35.123        5
+1969-12-31 15:00:00    5
+1970-12-31 04:59:59.999        5
+1996-03-31 07:03:33.123        5
+2018-11-17 05:33:33.123        3
+2019-12-31 09:33:33.123        5
+2100-01-01 01:33:33.123        1
+
+
+-- !query
+select col, date_format(col, 'd dd') from v
+-- !query schema
+struct<col:timestamp,date_format(col, d dd):string>
+-- !query output
+1582-05-31 19:40:35.123        31 31
+1969-12-31 15:00:00    31 31
+1970-12-31 04:59:59.999        31 31
+1996-03-31 07:03:33.123        31 31
+2018-11-17 05:33:33.123        17 17
+2019-12-31 09:33:33.123        31 31
+2100-01-01 01:33:33.123        1 01
+
+
+-- !query
+select col, date_format(col, 'DD') from v where col = timestamp '2100-01-01 01:33:33.123America/Los_Angeles'
+-- !query schema
+struct<col:timestamp,date_format(col, DD):string>
+-- !query output
+2100-01-01 01:33:33.123        01
+
+
+-- !query
+select col, date_format(col, 'D DDD') from v
+-- !query schema
+struct<col:timestamp,date_format(col, D DDD):string>
+-- !query output
+1582-05-31 19:40:35.123        151 151
+1969-12-31 15:00:00    365 365
+1970-12-31 04:59:59.999        365 365
+1996-03-31 07:03:33.123        91 091
+2018-11-17 05:33:33.123        321 321
+2019-12-31 09:33:33.123        365 365
+2100-01-01 01:33:33.123        1 001
+
+
+-- !query
+select col, date_format(col, 'H HH') from v
+-- !query schema
+struct<col:timestamp,date_format(col, H HH):string>
+-- !query output
+1582-05-31 19:40:35.123        19 19
+1969-12-31 15:00:00    15 15
+1970-12-31 04:59:59.999        4 04
+1996-03-31 07:03:33.123        7 07
+2018-11-17 05:33:33.123        5 05
+2019-12-31 09:33:33.123        9 09
+2100-01-01 01:33:33.123        1 01
+
+
+-- !query
+select col, date_format(col, 'h hh') from v
+-- !query schema
+struct<col:timestamp,date_format(col, h hh):string>
+-- !query output
+1582-05-31 19:40:35.123        7 07
+1969-12-31 15:00:00    3 03
+1970-12-31 04:59:59.999        4 04
+1996-03-31 07:03:33.123        7 07
+2018-11-17 05:33:33.123        5 05
+2019-12-31 09:33:33.123        9 09
+2100-01-01 01:33:33.123        1 01
+
+
+-- !query
+select col, date_format(col, 'k kk') from v
+-- !query schema
+struct<col:timestamp,date_format(col, k kk):string>
+-- !query output
+1582-05-31 19:40:35.123        19 19
+1969-12-31 15:00:00    15 15
+1970-12-31 04:59:59.999        4 04
+1996-03-31 07:03:33.123        7 07
+2018-11-17 05:33:33.123        5 05
+2019-12-31 09:33:33.123        9 09
+2100-01-01 01:33:33.123        1 01
+
+
+-- !query
+select col, date_format(col, 'K KK') from v
+-- !query schema
+struct<col:timestamp,date_format(col, K KK):string>
+-- !query output
+1582-05-31 19:40:35.123        7 07
+1969-12-31 15:00:00    3 03
+1970-12-31 04:59:59.999        4 04
+1996-03-31 07:03:33.123        7 07
+2018-11-17 05:33:33.123        5 05
+2019-12-31 09:33:33.123        9 09
+2100-01-01 01:33:33.123        1 01
+
+
+-- !query
+select col, date_format(col, 'm mm') from v
+-- !query schema
+struct<col:timestamp,date_format(col, m mm):string>
+-- !query output
+1582-05-31 19:40:35.123        40 40
+1969-12-31 15:00:00    0 00
+1970-12-31 04:59:59.999        59 59
+1996-03-31 07:03:33.123        3 03
+2018-11-17 05:33:33.123        33 33
+2019-12-31 09:33:33.123        33 33
+2100-01-01 01:33:33.123        33 33
+
+
+-- !query
+select col, date_format(col, 's ss') from v
+-- !query schema
+struct<col:timestamp,date_format(col, s ss):string>
+-- !query output
+1582-05-31 19:40:35.123        35 35
+1969-12-31 15:00:00    0 00
+1970-12-31 04:59:59.999        59 59
+1996-03-31 07:03:33.123        33 33
+2018-11-17 05:33:33.123        33 33
+2019-12-31 09:33:33.123        33 33
+2100-01-01 01:33:33.123        33 33
+
+
+-- !query
+select col, date_format(col, 'S SS SSS SSSS SSSSS SSSSSS SSSSSSS SSSSSSSS SSSSSSSSS') from v
+-- !query schema
+struct<col:timestamp,date_format(col, S SS SSS SSSS SSSSS SSSSSS SSSSSSS SSSSSSSS SSSSSSSSS):string>
+-- !query output
+1582-05-31 19:40:35.123	123 123 123 0123 00123 000123 0000123 00000123 000000123
+1969-12-31 15:00:00	0 00 000 0000 00000 000000 0000000 00000000 000000000
+1970-12-31 04:59:59.999	999 999 999 0999 00999 000999 0000999 00000999 000000999
+1996-03-31 07:03:33.123	123 123 123 0123 00123 000123 0000123 00000123 000000123
+2018-11-17 05:33:33.123	123 123 123 0123 00123 000123 0000123 00000123 000000123
+2019-12-31 09:33:33.123	123 123 123 0123 00123 000123 0000123 00000123 000000123
+2100-01-01 01:33:33.123	123 123 123 0123 00123 000123 0000123 00000123 000000123
+
+
+-- !query
+select col, date_format(col, 'a') from v
+-- !query schema
+struct<col:timestamp,date_format(col, a):string>
+-- !query output
+1582-05-31 19:40:35.123        PM
+1969-12-31 15:00:00    PM
+1970-12-31 04:59:59.999        AM
+1996-03-31 07:03:33.123        AM
+2018-11-17 05:33:33.123        AM
+2019-12-31 09:33:33.123        AM
+2100-01-01 01:33:33.123        AM
+
+
+-- !query
+select col, date_format(col, 'VV') from v
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+Illegal pattern character 'V'
+
+
+-- !query
+select col, date_format(col, 'z zz zzz zzzz') from v
+-- !query schema
+struct<col:timestamp,date_format(col, z zz zzz zzzz):string>
+-- !query output
+1582-05-31 19:40:35.123        PST PST PST Pacific Standard Time
+1969-12-31 15:00:00    PST PST PST Pacific Standard Time
+1970-12-31 04:59:59.999        PST PST PST Pacific Standard Time
+1996-03-31 07:03:33.123        PST PST PST Pacific Standard Time
+2018-11-17 05:33:33.123        PST PST PST Pacific Standard Time
+2019-12-31 09:33:33.123        PST PST PST Pacific Standard Time
+2100-01-01 01:33:33.123        PST PST PST Pacific Standard Time
+
+
+-- !query
+select col, date_format(col, 'X XX XXX') from v
+-- !query schema
+struct<col:timestamp,date_format(col, X XX XXX):string>
+-- !query output
+1582-05-31 19:40:35.123        -08 -0800 -08:00
+1969-12-31 15:00:00    -08 -0800 -08:00
+1970-12-31 04:59:59.999        -08 -0800 -08:00
+1996-03-31 07:03:33.123        -08 -0800 -08:00
+2018-11-17 05:33:33.123        -08 -0800 -08:00
+2019-12-31 09:33:33.123        -08 -0800 -08:00
+2100-01-01 01:33:33.123        -08 -0800 -08:00
+
+
+-- !query
+select col, date_format(col, 'XXXX XXXXX') from v
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+invalid ISO 8601 format: length=4
+
+
+-- !query
+select col, date_format(col, 'Z ZZ ZZZ ZZZZ ZZZZZ') from v
+-- !query schema
+struct<col:timestamp,date_format(col, Z ZZ ZZZ ZZZZ ZZZZZ):string>
+-- !query output
+1582-05-31 19:40:35.123        -0800 -0800 -0800 -0800 -0800
+1969-12-31 15:00:00    -0800 -0800 -0800 -0800 -0800
+1970-12-31 04:59:59.999        -0800 -0800 -0800 -0800 -0800
+1996-03-31 07:03:33.123        -0800 -0800 -0800 -0800 -0800
+2018-11-17 05:33:33.123        -0800 -0800 -0800 -0800 -0800
+2019-12-31 09:33:33.123        -0800 -0800 -0800 -0800 -0800
+2100-01-01 01:33:33.123        -0800 -0800 -0800 -0800 -0800
+
+
+-- !query
+select col, date_format(col, 'O OOOO') from v
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+Illegal pattern character 'O'
+
+
+-- !query
+select col, date_format(col, 'x xx xxx xxxx xxxx xxxxx') from v
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+Illegal pattern character 'x'
+
+
+-- !query
+select col, date_format(col, '[yyyy-MM-dd HH:mm:ss]') from v
+-- !query schema
+struct<col:timestamp,date_format(col, [yyyy-MM-dd HH:mm:ss]):string>
+-- !query output
+1582-05-31 19:40:35.123        [1582-05-31 19:40:35]
+1969-12-31 15:00:00    [1969-12-31 15:00:00]
+1970-12-31 04:59:59.999        [1970-12-31 04:59:59]
+1996-03-31 07:03:33.123        [1996-03-31 07:03:33]
+2018-11-17 05:33:33.123        [2018-11-17 05:33:33]
+2019-12-31 09:33:33.123        [2019-12-31 09:33:33]
+2100-01-01 01:33:33.123        [2100-01-01 01:33:33]
+
+
+-- !query
+select col, date_format(col, "姚123'GyYqQMLwWuEFDdhHmsSaVzZxXOV'") from v
+-- !query schema
struct<col:timestamp,date_format(col, 姚123'GyYqQMLwWuEFDdhHmsSaVzZxXOV'):string>
+-- !query output
+1582-05-31 19:40:35.123        姚123GyYqQMLwWuEFDdhHmsSaVzZxXOV
+1969-12-31 15:00:00    姚123GyYqQMLwWuEFDdhHmsSaVzZxXOV
+1970-12-31 04:59:59.999        姚123GyYqQMLwWuEFDdhHmsSaVzZxXOV
+1996-03-31 07:03:33.123        姚123GyYqQMLwWuEFDdhHmsSaVzZxXOV
+2018-11-17 05:33:33.123        姚123GyYqQMLwWuEFDdhHmsSaVzZxXOV
+2019-12-31 09:33:33.123        姚123GyYqQMLwWuEFDdhHmsSaVzZxXOV
+2100-01-01 01:33:33.123        姚123GyYqQMLwWuEFDdhHmsSaVzZxXOV
+
+
+-- !query
+select col, date_format(col, "''") from v
+-- !query schema
+struct<col:timestamp,date_format(col, ''):string>
+-- !query output
+1582-05-31 19:40:35.123        '
+1969-12-31 15:00:00    '
+1970-12-31 04:59:59.999        '
+1996-03-31 07:03:33.123        '
+2018-11-17 05:33:33.123        '
+2019-12-31 09:33:33.123        '
+2100-01-01 01:33:33.123        '
+
+
+-- !query
+select col, date_format(col, '') from v
+-- !query schema
+struct<col:timestamp,date_format(col, ):string>
+-- !query output
+1582-05-31 19:40:35.123        
+1969-12-31 15:00:00    
+1970-12-31 04:59:59.999        
+1996-03-31 07:03:33.123        
+2018-11-17 05:33:33.123        
+2019-12-31 09:33:33.123        
+2100-01-01 01:33:33.123
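
The legacy golden file above runs the same queries through SimpleDateFormat, which is what spark.sql.legacy.timeParserPolicy = LEGACY selects. A sketch of opting in; the result assumes SimpleDateFormat's fixed day numbering (1 = Monday, ..., 7 = Sunday), under which 2018-11-17, a Saturday, prints 6:

```sql
-- hypothetical session showing the legacy opt-in
spark-sql> SET spark.sql.legacy.timeParserPolicy=LEGACY;
spark-sql> SELECT date_format(timestamp '2018-11-17 13:33:33.333', 'u');
6
```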
diff --git a/sql/core/src/test/resources/sql-tests/results/datetime-formatting.sql.out b/sql/core/src/test/resources/sql-tests/results/datetime-formatting.sql.out
new file mode 100644
index 0000000..f724658
--- /dev/null
+++ b/sql/core/src/test/resources/sql-tests/results/datetime-formatting.sql.out
@@ -0,0 +1,431 @@
+-- Automatically generated by SQLQueryTestSuite
+-- Number of queries: 31
+
+
+-- !query
+create temporary view v as select col from values
+ (timestamp '1582-06-01 11:33:33.123UTC+080000'),
+ (timestamp '1970-01-01 00:00:00.000Europe/Paris'),
+ (timestamp '1970-12-31 23:59:59.999Asia/Srednekolymsk'),
+ (timestamp '1996-04-01 00:33:33.123Australia/Darwin'),
+ (timestamp '2018-11-17 13:33:33.123Z'),
+ (timestamp '2020-01-01 01:33:33.123Asia/Shanghai'),
+ (timestamp '2100-01-01 01:33:33.123America/Los_Angeles') t(col)
+-- !query schema
+struct<>
+-- !query output
+
+
+
+-- !query
+select col, date_format(col, 'G GG GGG GGGG') from v
+-- !query schema
+struct<col:timestamp,date_format(col, G GG GGG GGGG):string>
+-- !query output
+1582-05-31 19:40:35.123        AD AD AD Anno Domini
+1969-12-31 15:00:00    AD AD AD Anno Domini
+1970-12-31 04:59:59.999        AD AD AD Anno Domini
+1996-03-31 07:03:33.123        AD AD AD Anno Domini
+2018-11-17 05:33:33.123        AD AD AD Anno Domini
+2019-12-31 09:33:33.123        AD AD AD Anno Domini
+2100-01-01 01:33:33.123        AD AD AD Anno Domini
+
+
+-- !query
+select col, date_format(col, 'y yy yyy yyyy yyyyy yyyyyy yyyyyyy yyyyyyyy yyyyyyyyy yyyyyyyyyy') from v
+-- !query schema
+struct<col:timestamp,date_format(col, y yy yyy yyyy yyyyy yyyyyy yyyyyyy yyyyyyyy yyyyyyyyy yyyyyyyyyy):string>
+-- !query output
+1582-05-31 19:40:35.123	1582 82 1582 1582 01582 001582 0001582 00001582 000001582 0000001582
+1969-12-31 15:00:00	1969 69 1969 1969 01969 001969 0001969 00001969 000001969 0000001969
+1970-12-31 04:59:59.999	1970 70 1970 1970 01970 001970 0001970 00001970 000001970 0000001970
+1996-03-31 07:03:33.123	1996 96 1996 1996 01996 001996 0001996 00001996 000001996 0000001996
+2018-11-17 05:33:33.123	2018 18 2018 2018 02018 002018 0002018 00002018 000002018 0000002018
+2019-12-31 09:33:33.123	2019 19 2019 2019 02019 002019 0002019 00002019 000002019 0000002019
+2100-01-01 01:33:33.123	2100 00 2100 2100 02100 002100 0002100 00002100 000002100 0000002100
+
+
+-- !query
+select col, date_format(col, 'q qq') from v
+-- !query schema
+struct<col:timestamp,date_format(col, q qq):string>
+-- !query output
+1582-05-31 19:40:35.123        2 02
+1969-12-31 15:00:00    4 04
+1970-12-31 04:59:59.999        4 04
+1996-03-31 07:03:33.123        1 01
+2018-11-17 05:33:33.123        4 04
+2019-12-31 09:33:33.123        4 04
+2100-01-01 01:33:33.123        1 01
+
+
+-- !query
+select col, date_format(col, 'Q QQ QQQ QQQQ') from v
+-- !query schema
+struct<col:timestamp,date_format(col, Q QQ QQQ QQQQ):string>
+-- !query output
+1582-05-31 19:40:35.123        2 02 Q2 2nd quarter
+1969-12-31 15:00:00    4 04 Q4 4th quarter
+1970-12-31 04:59:59.999        4 04 Q4 4th quarter
+1996-03-31 07:03:33.123        1 01 Q1 1st quarter
+2018-11-17 05:33:33.123        4 04 Q4 4th quarter
+2019-12-31 09:33:33.123        4 04 Q4 4th quarter
+2100-01-01 01:33:33.123        1 01 Q1 1st quarter
+
+
+-- !query
+select col, date_format(col, 'M MM MMM MMMM') from v
+-- !query schema
+struct<col:timestamp,date_format(col, M MM MMM MMMM):string>
+-- !query output
+1582-05-31 19:40:35.123        5 05 May May
+1969-12-31 15:00:00    12 12 Dec December
+1970-12-31 04:59:59.999        12 12 Dec December
+1996-03-31 07:03:33.123        3 03 Mar March
+2018-11-17 05:33:33.123        11 11 Nov November
+2019-12-31 09:33:33.123        12 12 Dec December
+2100-01-01 01:33:33.123        1 01 Jan January
+
+
+-- !query
+select col, date_format(col, 'L LL') from v
+-- !query schema
+struct<col:timestamp,date_format(col, L LL):string>
+-- !query output
+1582-05-31 19:40:35.123        5 05
+1969-12-31 15:00:00    12 12
+1970-12-31 04:59:59.999        12 12
+1996-03-31 07:03:33.123        3 03
+2018-11-17 05:33:33.123        11 11
+2019-12-31 09:33:33.123        12 12
+2100-01-01 01:33:33.123        1 01
+
+
+-- !query
+select col, date_format(col, 'E EE EEE EEEE') from v
+-- !query schema
+struct<col:timestamp,date_format(col, E EE EEE EEEE):string>
+-- !query output
+1582-05-31 19:40:35.123        Mon Mon Mon Monday
+1969-12-31 15:00:00    Wed Wed Wed Wednesday
+1970-12-31 04:59:59.999        Thu Thu Thu Thursday
+1996-03-31 07:03:33.123        Sun Sun Sun Sunday
+2018-11-17 05:33:33.123        Sat Sat Sat Saturday
+2019-12-31 09:33:33.123        Tue Tue Tue Tuesday
+2100-01-01 01:33:33.123        Fri Fri Fri Friday
+
+
+-- !query
+select col, date_format(col, 'F') from v
+-- !query schema
+struct<col:timestamp,date_format(col, F):string>
+-- !query output
+1582-05-31 19:40:35.123        3
+1969-12-31 15:00:00    3
+1970-12-31 04:59:59.999        3
+1996-03-31 07:03:33.123        3
+2018-11-17 05:33:33.123        3
+2019-12-31 09:33:33.123        3
+2100-01-01 01:33:33.123        1
+
+
+-- !query
+select col, date_format(col, 'd dd') from v
+-- !query schema
+struct<col:timestamp,date_format(col, d dd):string>
+-- !query output
+1582-05-31 19:40:35.123        31 31
+1969-12-31 15:00:00    31 31
+1970-12-31 04:59:59.999        31 31
+1996-03-31 07:03:33.123        31 31
+2018-11-17 05:33:33.123        17 17
+2019-12-31 09:33:33.123        31 31
+2100-01-01 01:33:33.123        1 01
+
+
+-- !query
+select col, date_format(col, 'DD') from v where col = timestamp '2100-01-01 01:33:33.123America/Los_Angeles'
+-- !query schema
+struct<col:timestamp,date_format(col, DD):string>
+-- !query output
+2100-01-01 01:33:33.123        01
+
+
+-- !query
+select col, date_format(col, 'D DDD') from v
+-- !query schema
+struct<col:timestamp,date_format(col, D DDD):string>
+-- !query output
+1582-05-31 19:40:35.123        151 151
+1969-12-31 15:00:00    365 365
+1970-12-31 04:59:59.999        365 365
+1996-03-31 07:03:33.123        91 091
+2018-11-17 05:33:33.123        321 321
+2019-12-31 09:33:33.123        365 365
+2100-01-01 01:33:33.123        1 001
+
+
+-- !query
+select col, date_format(col, 'H HH') from v
+-- !query schema
+struct<col:timestamp,date_format(col, H HH):string>
+-- !query output
+1582-05-31 19:40:35.123        19 19
+1969-12-31 15:00:00    15 15
+1970-12-31 04:59:59.999        4 04
+1996-03-31 07:03:33.123        7 07
+2018-11-17 05:33:33.123        5 05
+2019-12-31 09:33:33.123        9 09
+2100-01-01 01:33:33.123        1 01
+
+
+-- !query
+select col, date_format(col, 'h hh') from v
+-- !query schema
+struct<col:timestamp,date_format(col, h hh):string>
+-- !query output
+1582-05-31 19:40:35.123        7 07
+1969-12-31 15:00:00    3 03
+1970-12-31 04:59:59.999        4 04
+1996-03-31 07:03:33.123        7 07
+2018-11-17 05:33:33.123        5 05
+2019-12-31 09:33:33.123        9 09
+2100-01-01 01:33:33.123        1 01
+
+
+-- !query
+select col, date_format(col, 'k kk') from v
+-- !query schema
+struct<col:timestamp,date_format(col, k kk):string>
+-- !query output
+1582-05-31 19:40:35.123        19 19
+1969-12-31 15:00:00    15 15
+1970-12-31 04:59:59.999        4 04
+1996-03-31 07:03:33.123        7 07
+2018-11-17 05:33:33.123        5 05
+2019-12-31 09:33:33.123        9 09
+2100-01-01 01:33:33.123        1 01
+
+
+-- !query
+select col, date_format(col, 'K KK') from v
+-- !query schema
+struct<col:timestamp,date_format(col, K KK):string>
+-- !query output
+1582-05-31 19:40:35.123        7 07
+1969-12-31 15:00:00    3 03
+1970-12-31 04:59:59.999        4 04
+1996-03-31 07:03:33.123        7 07
+2018-11-17 05:33:33.123        5 05
+2019-12-31 09:33:33.123        9 09
+2100-01-01 01:33:33.123        1 01
+
+
+-- !query
+select col, date_format(col, 'm mm') from v
+-- !query schema
+struct<col:timestamp,date_format(col, m mm):string>
+-- !query output
+1582-05-31 19:40:35.123        40 40
+1969-12-31 15:00:00    0 00
+1970-12-31 04:59:59.999        59 59
+1996-03-31 07:03:33.123        3 03
+2018-11-17 05:33:33.123        33 33
+2019-12-31 09:33:33.123        33 33
+2100-01-01 01:33:33.123        33 33
+
+
+-- !query
+select col, date_format(col, 's ss') from v
+-- !query schema
+struct<col:timestamp,date_format(col, s ss):string>
+-- !query output
+1582-05-31 19:40:35.123        35 35
+1969-12-31 15:00:00    0 00
+1970-12-31 04:59:59.999        59 59
+1996-03-31 07:03:33.123        33 33
+2018-11-17 05:33:33.123        33 33
+2019-12-31 09:33:33.123        33 33
+2100-01-01 01:33:33.123        33 33
+
+
+-- !query
+select col, date_format(col, 'S SS SSS SSSS SSSSS SSSSSS SSSSSSS SSSSSSSS SSSSSSSSS') from v
+-- !query schema
+struct<col:timestamp,date_format(col, S SS SSS SSSS SSSSS SSSSSS SSSSSSS SSSSSSSS SSSSSSSSS):string>
+-- !query output
+1582-05-31 19:40:35.123	1 12 123 1230 12300 123000 1230000 12300000 123000000
+1969-12-31 15:00:00	0 00 000 0000 00000 000000 0000000 00000000 000000000
+1970-12-31 04:59:59.999	9 99 999 9990 99900 999000 9990000 99900000 999000000
+1996-03-31 07:03:33.123	1 12 123 1230 12300 123000 1230000 12300000 123000000
+2018-11-17 05:33:33.123	1 12 123 1230 12300 123000 1230000 12300000 123000000
+2019-12-31 09:33:33.123	1 12 123 1230 12300 123000 1230000 12300000 123000000
+2100-01-01 01:33:33.123	1 12 123 1230 12300 123000 1230000 12300000 123000000
+
+
+-- !query
+select col, date_format(col, 'a') from v
+-- !query schema
+struct<col:timestamp,date_format(col, a):string>
+-- !query output
+1582-05-31 19:40:35.123        PM
+1969-12-31 15:00:00    PM
+1970-12-31 04:59:59.999        AM
+1996-03-31 07:03:33.123        AM
+2018-11-17 05:33:33.123        AM
+2019-12-31 09:33:33.123        AM
+2100-01-01 01:33:33.123        AM
+
+
+-- !query
+select col, date_format(col, 'VV') from v
+-- !query schema
+struct<col:timestamp,date_format(col, VV):string>
+-- !query output
+1582-05-31 19:40:35.123        America/Los_Angeles
+1969-12-31 15:00:00    America/Los_Angeles
+1970-12-31 04:59:59.999        America/Los_Angeles
+1996-03-31 07:03:33.123        America/Los_Angeles
+2018-11-17 05:33:33.123        America/Los_Angeles
+2019-12-31 09:33:33.123        America/Los_Angeles
+2100-01-01 01:33:33.123        America/Los_Angeles
+
+
+-- !query
+select col, date_format(col, 'z zz zzz zzzz') from v
+-- !query schema
+struct<col:timestamp,date_format(col, z zz zzz zzzz):string>
+-- !query output
+1582-05-31 19:40:35.123        PST PST PST Pacific Standard Time
+1969-12-31 15:00:00    PST PST PST Pacific Standard Time
+1970-12-31 04:59:59.999        PST PST PST Pacific Standard Time
+1996-03-31 07:03:33.123        PST PST PST Pacific Standard Time
+2018-11-17 05:33:33.123        PST PST PST Pacific Standard Time
+2019-12-31 09:33:33.123        PST PST PST Pacific Standard Time
+2100-01-01 01:33:33.123        PST PST PST Pacific Standard Time
+
+
+-- !query
+select col, date_format(col, 'X XX XXX') from v
+-- !query schema
+struct<col:timestamp,date_format(col, X XX XXX):string>
+-- !query output
+1582-05-31 19:40:35.123        -0752 -0752 -07:52
+1969-12-31 15:00:00    -08 -0800 -08:00
+1970-12-31 04:59:59.999        -08 -0800 -08:00
+1996-03-31 07:03:33.123        -08 -0800 -08:00
+2018-11-17 05:33:33.123        -08 -0800 -08:00
+2019-12-31 09:33:33.123        -08 -0800 -08:00
+2100-01-01 01:33:33.123        -08 -0800 -08:00
+
+
+-- !query
+select col, date_format(col, 'XXXX XXXXX') from v
+-- !query schema
+struct<col:timestamp,date_format(col, XXXX XXXXX):string>
+-- !query output
+1582-05-31 19:40:35.123        -075258 -07:52:58
+1969-12-31 15:00:00    -0800 -08:00
+1970-12-31 04:59:59.999        -0800 -08:00
+1996-03-31 07:03:33.123        -0800 -08:00
+2018-11-17 05:33:33.123        -0800 -08:00
+2019-12-31 09:33:33.123        -0800 -08:00
+2100-01-01 01:33:33.123        -0800 -08:00
+
+
+-- !query
+select col, date_format(col, 'Z ZZ ZZZ ZZZZ ZZZZZ') from v
+-- !query schema
+struct<col:timestamp,date_format(col, Z ZZ ZZZ ZZZZ ZZZZZ):string>
+-- !query output
+1582-05-31 19:40:35.123        -0752 -0752 -0752 GMT-07:52:58 -07:52:58
+1969-12-31 15:00:00    -0800 -0800 -0800 GMT-08:00 -08:00
+1970-12-31 04:59:59.999        -0800 -0800 -0800 GMT-08:00 -08:00
+1996-03-31 07:03:33.123        -0800 -0800 -0800 GMT-08:00 -08:00
+2018-11-17 05:33:33.123        -0800 -0800 -0800 GMT-08:00 -08:00
+2019-12-31 09:33:33.123        -0800 -0800 -0800 GMT-08:00 -08:00
+2100-01-01 01:33:33.123        -0800 -0800 -0800 GMT-08:00 -08:00
+
+
+-- !query
+select col, date_format(col, 'O OOOO') from v
+-- !query schema
+struct<col:timestamp,date_format(col, O OOOO):string>
+-- !query output
+1582-05-31 19:40:35.123        GMT-7:52:58 GMT-07:52:58
+1969-12-31 15:00:00    GMT-8 GMT-08:00
+1970-12-31 04:59:59.999        GMT-8 GMT-08:00
+1996-03-31 07:03:33.123        GMT-8 GMT-08:00
+2018-11-17 05:33:33.123        GMT-8 GMT-08:00
+2019-12-31 09:33:33.123        GMT-8 GMT-08:00
+2100-01-01 01:33:33.123        GMT-8 GMT-08:00
+
+
+-- !query
+select col, date_format(col, 'x xx xxx xxxx xxxx xxxxx') from v
+-- !query schema
+struct<col:timestamp,date_format(col, x xx xxx xxxx xxxx xxxxx):string>
+-- !query output
+1582-05-31 19:40:35.123        -0752 -0752 -07:52 -075258 -075258 -07:52:58
+1969-12-31 15:00:00    -08 -0800 -08:00 -0800 -0800 -08:00
+1970-12-31 04:59:59.999        -08 -0800 -08:00 -0800 -0800 -08:00
+1996-03-31 07:03:33.123        -08 -0800 -08:00 -0800 -0800 -08:00
+2018-11-17 05:33:33.123        -08 -0800 -08:00 -0800 -0800 -08:00
+2019-12-31 09:33:33.123        -08 -0800 -08:00 -0800 -0800 -08:00
+2100-01-01 01:33:33.123        -08 -0800 -08:00 -0800 -0800 -08:00
+
+
+-- !query
+select col, date_format(col, '[yyyy-MM-dd HH:mm:ss]') from v
+-- !query schema
+struct<col:timestamp,date_format(col, [yyyy-MM-dd HH:mm:ss]):string>
+-- !query output
+1582-05-31 19:40:35.123        1582-05-31 19:40:35
+1969-12-31 15:00:00    1969-12-31 15:00:00
+1970-12-31 04:59:59.999        1970-12-31 04:59:59
+1996-03-31 07:03:33.123        1996-03-31 07:03:33
+2018-11-17 05:33:33.123        2018-11-17 05:33:33
+2019-12-31 09:33:33.123        2019-12-31 09:33:33
+2100-01-01 01:33:33.123        2100-01-01 01:33:33
+
+
+-- !query
+select col, date_format(col, "姚123'GyYqQMLwWuEFDdhHmsSaVzZxXOV'") from v
+-- !query schema
struct<col:timestamp,date_format(col, 姚123'GyYqQMLwWuEFDdhHmsSaVzZxXOV'):string>
+-- !query output
+1582-05-31 19:40:35.123        姚123GyYqQMLwWuEFDdhHmsSaVzZxXOV
+1969-12-31 15:00:00    姚123GyYqQMLwWuEFDdhHmsSaVzZxXOV
+1970-12-31 04:59:59.999        姚123GyYqQMLwWuEFDdhHmsSaVzZxXOV
+1996-03-31 07:03:33.123        姚123GyYqQMLwWuEFDdhHmsSaVzZxXOV
+2018-11-17 05:33:33.123        姚123GyYqQMLwWuEFDdhHmsSaVzZxXOV
+2019-12-31 09:33:33.123        姚123GyYqQMLwWuEFDdhHmsSaVzZxXOV
+2100-01-01 01:33:33.123        姚123GyYqQMLwWuEFDdhHmsSaVzZxXOV
+
+
+-- !query
+select col, date_format(col, "''") from v
+-- !query schema
+struct<col:timestamp,date_format(col, ''):string>
+-- !query output
+1582-05-31 19:40:35.123        '
+1969-12-31 15:00:00    '
+1970-12-31 04:59:59.999        '
+1996-03-31 07:03:33.123        '
+2018-11-17 05:33:33.123        '
+2019-12-31 09:33:33.123        '
+2100-01-01 01:33:33.123        '
+
+
+-- !query
+select col, date_format(col, '') from v
+-- !query schema
+struct<col:timestamp,date_format(col, ):string>
+-- !query output
+1582-05-31 19:40:35.123        
+1969-12-31 15:00:00    
+1970-12-31 04:59:59.999        
+1996-03-31 07:03:33.123        
+2018-11-17 05:33:33.123        
+2019-12-31 09:33:33.123        
+2100-01-01 01:33:33.123
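
Both golden files show the text-style day-of-week letter 'E' staying valid under the new formatter, so weekday names can still be rendered without any week-based numeric field. A sketch consistent with the 'E EE EEE EEEE' rows above, where 2018-11-17 formats as Saturday:

```sql
-- hypothetical session; 'E' remains a supported way to print weekday names
spark-sql> SELECT date_format(timestamp '2018-11-17 13:33:33.333', 'yyyy-MM-dd EEEE');
2018-11-17 Saturday
```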
diff --git a/sql/core/src/test/resources/sql-tests/results/datetime-legacy.sql.out b/sql/core/src/test/resources/sql-tests/results/datetime-legacy.sql.out
index 71b1064..b244f49 100644
--- a/sql/core/src/test/resources/sql-tests/results/datetime-legacy.sql.out
+++ b/sql/core/src/test/resources/sql-tests/results/datetime-legacy.sql.out
@@ -1,5 +1,5 @@
 -- Automatically generated by SQLQueryTestSuite
--- Number of queries: 112
+-- Number of queries: 109
 
 
 -- !query
@@ -647,32 +647,6 @@ struct<to_timestamp('S2019-10-06', '\'S\'yyyy-MM-dd'):timestamp>
 
 
 -- !query
-select date_format(timestamp '2019-10-06', 'yyyy-MM-dd uuee')
--- !query schema
-struct<>
--- !query output
-java.lang.IllegalArgumentException
-Illegal pattern character 'e'
-
-
--- !query
-select date_format(timestamp '2019-10-06', 'yyyy-MM-dd uucc')
--- !query schema
-struct<>
--- !query output
-java.lang.IllegalArgumentException
-Illegal pattern character 'c'
-
-
--- !query
-select date_format(timestamp '2019-10-06', 'yyyy-MM-dd uuuu')
--- !query schema
-struct<date_format(TIMESTAMP '2019-10-06 00:00:00', yyyy-MM-dd uuuu):string>
--- !query output
-2019-10-06 0007
-
-
--- !query
 select to_timestamp("2019-10-06T10:11:12'12", "yyyy-MM-dd'T'HH:mm:ss''SSSS")
 -- !query schema
 struct<to_timestamp('2019-10-06T10:11:12\'12', 'yyyy-MM-dd\'T\'HH:mm:ss\'\'SSSS'):timestamp>
diff --git a/sql/core/src/test/resources/sql-tests/results/datetime.sql.out b/sql/core/src/test/resources/sql-tests/results/datetime.sql.out
index 9b1c847..3255761 100755
--- a/sql/core/src/test/resources/sql-tests/results/datetime.sql.out
+++ b/sql/core/src/test/resources/sql-tests/results/datetime.sql.out
@@ -1,5 +1,5 @@
 -- Automatically generated by SQLQueryTestSuite
--- Number of queries: 112
+-- Number of queries: 109
 
 
 -- !query
@@ -647,32 +647,6 @@ struct<to_timestamp('S2019-10-06', '\'S\'yyyy-MM-dd'):timestamp>
 
 
 -- !query
-select date_format(timestamp '2019-10-06', 'yyyy-MM-dd uuee')
--- !query schema
-struct<>
--- !query output
-java.lang.IllegalArgumentException
-Illegal pattern character: e
-
-
--- !query
-select date_format(timestamp '2019-10-06', 'yyyy-MM-dd uucc')
--- !query schema
-struct<>
--- !query output
-java.lang.IllegalArgumentException
-Illegal pattern character: c
-
-
--- !query
-select date_format(timestamp '2019-10-06', 'yyyy-MM-dd uuuu')
--- !query schema
-struct<date_format(TIMESTAMP '2019-10-06 00:00:00', yyyy-MM-dd uuuu):string>
--- !query output
-2019-10-06 Sunday
-
-
--- !query
 select to_timestamp("2019-10-06T10:11:12'12", "yyyy-MM-dd'T'HH:mm:ss''SSSS")
 -- !query schema
 struct<to_timestamp('2019-10-06T10:11:12\'12', 'yyyy-MM-dd\'T\'HH:mm:ss\'\'SSSS'):timestamp>

