MaxGekk commented on code in PR #47846:
URL: https://github.com/apache/spark/pull/47846#discussion_r1733291164


##########
common/utils/src/main/resources/error/error-conditions.json:
##########
@@ -1914,6 +1914,44 @@
     ],
     "sqlState" : "22012"
   },
+  "INTERVAL_ERROR" : {
+    "message" : [
+      "Interval error."
+    ],
+    "subClass" : {
+      "DAY_TIME_PARSING" : {

Review Comment:
   Spark doesn't have arithmetic intervals, only datetime intervals.
   
   Let's think of moving some interval parsing errors to `INVALID_INTERVAL_FORMAT`.
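
   For illustration, a rough sketch of how a couple of these could be folded into `INVALID_INTERVAL_FORMAT` (hypothetical — the top-level message, `sqlState`, and sub-class names here are placeholders; only the sub-class messages are copied from this PR):
   ```json
   "INVALID_INTERVAL_FORMAT" : {
     "message" : [
       "Invalid interval format."
     ],
     "subClass" : {
       "DAY_TIME_PARSING" : {
         "message" : [
           "Error parsing interval day-time string: <msg>."
         ]
       },
       "SECOND_NANO_FORMAT" : {
         "message" : [
           "Interval string does not match second-nano format of ss.nnnnnnnnn."
         ]
       }
     },
     "sqlState" : "22006"
   }
   ```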



##########
common/utils/src/main/resources/error/error-conditions.json:
##########
@@ -1914,6 +1914,49 @@
     ],
     "sqlState" : "22012"
   },
+  "INTERVAL_ERROR" : {
+    "message" : [
+      "Interval error."
+    ],
+    "subClass" : {
+      "DAY_TIME_PARSING" : {
+        "message" : [
+          "Error parsing interval day-time string: <msg>."
+        ]
+      },
+      "ILLEGAL_DAY_OF_WEEK" : {
+        "message" : [
+          "Illegal input for day of week: <string>."
+        ]
+      },
+      "INTERVAL_PARSING" : {
+        "message" : [
+          "Error parsing interval <interval> string: <msg>."
+        ]
+      },
+      "SECOND_NANO_FORMAT" : {
+        "message" : [
+          "Interval string does not match second-nano format of ss.nnnnnnnnn."
+        ]
+      },
+      "UNMATCHED_FORMAT_STRING" : {
+        "message" : [
+          "Interval string does not match <intervalStr> format of <supportedFormat> when cast to <typeName>: <input>."
+        ]
+      },
+      "UNMATCHED_FORMAT_STRING_WITH_NOTICE" : {
+        "message" : [
+          "Interval string does not match <intervalStr> format of <supportedFormat> when cast to <typeName>: <input>, set spark.sql.legacy.fromDayTimeString.enabled to true to restore the behavior before Spark 3.0."

Review Comment:
   Let's follow the behaviour of `toSQLConf` and `toSQLConfVal`:
   ```suggestion
             "Interval string does not match <intervalStr> format of 
<supportedFormat> when cast to <typeName>: <input>. Set 
\"spark.sql.legacy.fromDayTimeString.enabled\" to \"true\" to restore the 
behavior before Spark 3.0."
   ```
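
   Putting the two comments together, the entry might end up reading roughly like this (hypothetical sketch — the parent condition and sub-class name are assumptions; only the message text follows the suggestion above):
   ```json
   "INVALID_INTERVAL_FORMAT" : {
     "subClass" : {
       "UNMATCHED_FORMAT_STRING_WITH_NOTICE" : {
         "message" : [
           "Interval string does not match <intervalStr> format of <supportedFormat> when cast to <typeName>: <input>. Set \"spark.sql.legacy.fromDayTimeString.enabled\" to \"true\" to restore the behavior before Spark 3.0."
         ]
       }
     }
   }
   ```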



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
