[GitHub] [spark] yaooqinn commented on pull request #28650: [SPARK-31830][SQL] Consistent error handling for datetime formatting and parsing functions

2020-05-29 (GitBox)
yaooqinn commented on pull request #28650: URL: https://github.com/apache/spark/pull/28650#issuecomment-635783265

OK


2020-05-28 (GitBox)
yaooqinn commented on pull request #28650: URL: https://github.com/apache/spark/pull/28650#issuecomment-635449894

But maybe we should apply `SparkUpgradeException` to `format()` as we do for `parse()`, for a better error message for end users.
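The suggestion above — wrapping `format()` errors the same way `parse()` errors are already wrapped — can be sketched in Python. This is a hedged illustration, not Spark's actual Scala implementation: `with_upgrade_hint` and the stub formatters are invented for this sketch; only the `SparkUpgradeException` name comes from the thread.

```python
# Hedged sketch, not Spark's actual (Scala) code: the idea discussed in the
# thread is that format() failures should be wrapped in SparkUpgradeException
# the same way parse() failures already are, so end users get the upgrade
# hint in both directions. All helper names below are invented.
import datetime


class SparkUpgradeException(Exception):
    """Stand-in for Spark's exception that points users at the legacy policy."""


def with_upgrade_hint(action, func, *args):
    # Wrap any datetime error with a hint about the legacy fallback config
    # (spark.sql.legacy.timeParserPolicy is Spark 3.0's real fallback setting).
    try:
        return func(*args)
    except ValueError as e:
        raise SparkUpgradeException(
            f"Fail to {action} with the new formatter; set "
            "spark.sql.legacy.timeParserPolicy to LEGACY to restore the "
            "pre-3.0 behavior."
        ) from e


def _parse(text):
    return datetime.datetime.strptime(text, "%Y-%m-%d")


def _format(dt):
    return dt.strftime("%Y-%m-%d")


def parse(text):
    # parse() already goes through the wrapper...
    return with_upgrade_hint("parse", _parse, text)


def format_dt(dt):
    # ...and the suggestion is to route format() through it too.
    return with_upgrade_hint("format", _format, dt)
```

With this, a bad input to `parse()` raises `SparkUpgradeException` (carrying the config hint) instead of a bare `ValueError`, and `format_dt()` surfaces formatting errors the same way.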


2020-05-28 (GitBox)
yaooqinn commented on pull request #28650: URL: https://github.com/apache/spark/pull/28650#issuecomment-635448724

No, the pattern `yyy-MM-dd` is valid for both versions of the formatters, but calling `format()` throws an exception in the new one, while the error is silently suppressed in the legacy one.


2020-05-28 (GitBox)
yaooqinn commented on pull request #28650: URL: https://github.com/apache/spark/pull/28650#issuecomment-635436944

> ```
> spark-sql> select from_unixtime(1, 'yyy-MM-dd');
> NULL
> ```
>
> Why don't we throw `SparkUpgradeException` in this case?

that logic
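The `NULL` in the quoted example is the formatting error being swallowed rather than raised. A hedged Python sketch of that control flow (the pattern check is an invented stand-in for whatever the new formatter rejects, and `from_unixtime_nullsafe` is not a real Spark API):

```python
# Hedged sketch of the behavior in the quoted SQL: the formatting error is
# caught and turned into NULL (None here) instead of surfacing to the user.
# The pattern check below is an invented stand-in, not Spark's real
# validation logic.
import datetime


def _format_epoch(epoch_seconds, pattern):
    if pattern.count("y") not in (2, 4):
        # Stand-in for a pattern the formatter rejects at format() time.
        raise ValueError(f"Illegal pattern component: {pattern}")
    dt = datetime.datetime.fromtimestamp(epoch_seconds, datetime.timezone.utc)
    return dt.strftime("%Y-%m-%d")


def from_unixtime_nullsafe(epoch_seconds, pattern):
    # The error is silently suppressed and the row becomes NULL in SQL.
    try:
        return _format_epoch(epoch_seconds, pattern)
    except ValueError:
        return None


print(from_unixtime_nullsafe(1, "yyy-MM-dd"))   # None, mirrors the NULL above
print(from_unixtime_nullsafe(1, "yyyy-MM-dd"))  # 1970-01-01
```

The question in the quote is exactly about that `except` branch: whether it should re-raise as `SparkUpgradeException` instead of returning `None`.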