gengliangwang commented on a change in pull request #32995:
URL: https://github.com/apache/spark/pull/32995#discussion_r655148980



##########
File path: sql/core/src/test/resources/sql-tests/results/ansi/datetime.sql.out
##########
@@ -936,6 +1198,33 @@ org.apache.spark.SparkUpgradeException
 You may get a different result due to the upgrading of Spark 3.0: Fail to 
recognize 'dd MM yyyy EEEEE' pattern in the DateTimeFormatter. 1) You can set 
spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before 
Spark 3.0. 2) You can form a valid datetime pattern with the guide from 
https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
 
 
+-- !query
+select to_timestamp_ntz('2019-10-06 A', 'yyyy-MM-dd GGGGG')
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.SparkUpgradeException
+You may get a different result due to the upgrading of Spark 3.0: Fail to 
recognize 'yyyy-MM-dd GGGGG' pattern in the DateTimeFormatter. 1) You can set 
spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before 
Spark 3.0. 2) You can form a valid datetime pattern with the guide from 
https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html

Review comment:
       I will have a follow-up to improve the error message.
   This error message is used in multiple SQL output files, and some of them are not related to this PR.
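   As a side note, the two remedies the message points at look roughly like this, assuming a running `SparkSession` named `spark` (a sketch only, not part of this PR):

```scala
// Option 1: restore the pre-3.0 parser behavior for the whole session.
spark.conf.set("spark.sql.legacy.timeParserPolicy", "LEGACY")

// Option 2: use a pattern the new DateTimeFormatter accepts; the narrow era
// field 'GGGGG' is what the parser rejects in the query above.
spark.sql("select to_timestamp_ntz('2019-10-06', 'yyyy-MM-dd')").show()
```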

##########
File path: sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala
##########
@@ -379,6 +379,7 @@ object SparkExecuteStatementOperation {
         case CalendarIntervalType => StringType.catalogString
         case _: YearMonthIntervalType => "interval_year_month"
         case _: DayTimeIntervalType => "interval_day_time"
+        case _: TimestampWithoutTZType => "timestamp"

Review comment:
       It seems that there is no better choice from: 
https://github.com/apache/hive/blob/master/serde/src/java/org/apache/hadoop/hive/serde2/thrift/Type.java#L230
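   (Context, to the best of my understanding: Hive's thrift `Type` enum only exposes a plain `TIMESTAMP_TYPE`, and Hive timestamps carry no time-zone information of their own, so reporting `TimestampWithoutTZType` as "timestamp" seems to be the closest available fit.)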

##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/TimestampFormatter.scala
##########
@@ -50,9 +51,31 @@ sealed trait TimestampFormatter extends Serializable {
   @throws(classOf[DateTimeException])
   def parse(s: String): Long
 
+  /**
+   * Parses a timestamp in a string and converts it to microseconds since Unix Epoch
+   * in local time.
+   *
+   * @param s - string with timestamp to parse
+   * @return microseconds since epoch.
+   * @throws ParseException can be thrown by legacy parser
+   * @throws DateTimeParseException can be thrown by new parser
+   * @throws DateTimeException unable to obtain local date or time
+   * @throws UnsupportedOperationException cannot use legacy formatter for the parsing
+   */
+  @throws(classOf[ParseException])
+  @throws(classOf[DateTimeParseException])
+  @throws(classOf[DateTimeException])
+  @throws(classOf[UnsupportedOperationException])
+  def parseWithoutTimeZone(s: String): Long =
+    throw QueryExecutionErrors.cannotUseLegacyFormatterForTimestampWithoutTZ();

Review comment:
       The error here won't be hit, since the formatter for TimestampWithoutTZ is always Iso8601TimestampFormatter.
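   For readers following along, a rough standalone sketch of what the ISO-8601 path is expected to do here (illustrative only, not Spark's actual implementation): resolve the string to a `LocalDateTime` and count microseconds from the local 1970-01-01T00:00 epoch.

```scala
import java.time.LocalDateTime
import java.time.format.DateTimeFormatter
import java.time.temporal.ChronoUnit

// Illustrative only: parse a timestamp string that carries no time zone and
// return microseconds since 1970-01-01T00:00:00 interpreted in local time.
def parseWithoutTimeZoneSketch(s: String, pattern: String): Long = {
  val parsed = LocalDateTime.parse(s, DateTimeFormatter.ofPattern(pattern))
  ChronoUnit.MICROS.between(LocalDateTime.of(1970, 1, 1, 0, 0, 0), parsed)
}

// e.g. parseWithoutTimeZoneSketch("2019-10-06 10:11:12", "yyyy-MM-dd HH:mm:ss")
```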

##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/datetimeExpressions.scala
##########
@@ -66,6 +66,10 @@ trait TimestampFormatterHelper extends TimeZoneAwareExpression {
 
   protected def isParsing: Boolean
 
+  // Whether the timestamp formatter is for TimestampWithoutTZType.
+  // If yes, the formatter is always `Iso8601TimestampFormatter`.

Review comment:
       Well, I prefer to put all the logic in `TimestampFormatter.getFormatter`.
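   To sketch what I mean (names and signatures below are purely illustrative, not Spark's real `getFormatter` overloads), the factory itself would pick the concrete formatter:

```scala
// Hypothetical shape only. The point is that the formatter choice lives in the
// factory rather than behind a flag on each datetime expression.
object FormatterFactorySketch {
  sealed trait Formatter
  case class Iso8601(pattern: String) extends Formatter
  case class Legacy(pattern: String) extends Formatter

  def getFormatter(
      pattern: String,
      useLegacyPolicy: Boolean,
      forTimestampWithoutTZ: Boolean): Formatter = {
    // Timestamps without a time zone always take the ISO-8601 path,
    // so the legacy policy never applies to them.
    if (forTimestampWithoutTZ) Iso8601(pattern)
    else if (useLegacyPolicy) Legacy(pattern)
    else Iso8601(pattern)
  }
}
```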

##########
File path: sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala
##########
@@ -379,6 +379,7 @@ object SparkExecuteStatementOperation {
         case CalendarIntervalType => StringType.catalogString
         case _: YearMonthIntervalType => "interval_year_month"
         case _: DayTimeIntervalType => "interval_day_time"
+        case _: TimestampWithoutTZType => "timestamp"

Review comment:
       +1




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


