[GitHub] [spark] yaooqinn commented on a change in pull request #32714: [SPARK-35581][SQL] Support special datetime values in typed literals only
yaooqinn commented on a change in pull request #32714:
URL: https://github.com/apache/spark/pull/32714#discussion_r642885340

## File path: docs/sql-migration-guide.md

## @@ -91,6 +91,8 @@ license: |
 - In Spark 3.2, `CREATE TABLE AS SELECT` with non-empty `LOCATION` will throw `AnalysisException`. To restore the behavior before Spark 3.2, you can set `spark.sql.legacy.allowNonEmptyLocationInCTAS` to `true`.
+ - In Spark 3.2, the special datetime values such as `epoch`, `today`, `yesterday`, `tomorrow` and `now` are supported in typed literals only, for instance `select timestamp'now'`. In Spark 3.1 and earlier, such special values are supported in any casts of strings to dates/timestamps. To restore the behavior before Spark 3.2, you should preprocess string columns and convert the strings to desired timestamps explicitly using UDF for instance.

Review comment:
How about "to keep these special values as datetimes in Spark 3.1 and 3.0, you need to match them manually, e.g. `if(c in ('now', 'today'), current_date(), c)`"? I think it's better to suggest that users use built-in functions rather than UDFs.
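For illustration, a minimal sketch of the built-in-function approach suggested above, extended to the other special values. The table `t` and string column `c` are hypothetical, and time zone handling is simplified to the session defaults:

```sql
-- Hypothetical preprocessing of a string column `c` in table `t`:
-- map the special strings to concrete dates with built-in functions,
-- and fall back to a plain cast for everything else.
SELECT CASE
         WHEN c IN ('now', 'today') THEN current_date()
         WHEN c = 'yesterday'       THEN date_sub(current_date(), 1)
         WHEN c = 'tomorrow'        THEN date_add(current_date(), 1)
         WHEN c = 'epoch'           THEN DATE'1970-01-01'
         ELSE CAST(c AS DATE)
       END AS d
FROM t;
```

For timestamp columns the same pattern would presumably use `current_timestamp()` and `TIMESTAMP'1970-01-01 00:00:00'` instead.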
[GitHub] [spark] yaooqinn commented on a change in pull request #32714: [SPARK-35581][SQL] Support special datetime values in typed literals only
yaooqinn commented on a change in pull request #32714:
URL: https://github.com/apache/spark/pull/32714#discussion_r642877570

## File path: docs/sql-migration-guide.md

## @@ -91,6 +91,8 @@ license: |
 - In Spark 3.2, `CREATE TABLE AS SELECT` with non-empty `LOCATION` will throw `AnalysisException`. To restore the behavior before Spark 3.2, you can set `spark.sql.legacy.allowNonEmptyLocationInCTAS` to `true`.
+ - In Spark 3.2, the special datetime values such as `epoch`, `today`, `yesterday`, `tomorrow` and `now` are supported in typed literals only, for instance `select timestamp'now'`. In Spark 3.1 and earlier, such special values are supported in any casts of strings to dates/timestamps. To restore the behavior before Spark 3.2, you should preprocess string columns and convert the strings to desired timestamps explicitly using UDF for instance.

Review comment:
Hi @MaxGekk, thanks for your suggestion. I think that if users need to preprocess the data, we can hardly call it `To restore the behavior before Spark 3.2`.
[GitHub] [spark] yaooqinn commented on a change in pull request #32714: [SPARK-35581][SQL] Support special datetime values in typed literals only
yaooqinn commented on a change in pull request #32714:
URL: https://github.com/apache/spark/pull/32714#discussion_r642875875

## File path: docs/sql-migration-guide.md

## @@ -91,6 +91,8 @@ license: |
 - In Spark 3.2, `CREATE TABLE AS SELECT` with non-empty `LOCATION` will throw `AnalysisException`. To restore the behavior before Spark 3.2, you can set `spark.sql.legacy.allowNonEmptyLocationInCTAS` to `true`.
+ - In Spark 3.2, special datetime values such as `epoch`, `today`, `yesterday`, `tomorrow` and `now` are supported in typed literals only, for instance `select timestamp'now'`. In Spark 3.1 and 3.0, such special values are supported in any casts of strings to dates/timestamps. To restore the behavior before Spark 3.2, you should preprocess string columns and convert the strings to desired dates/timestamps explicitly using UDF for instance.

Review comment:
```suggestion
  - In Spark 3.2, special datetime values such as `epoch`, `today`, `yesterday`, `tomorrow`, and `now` are supported in typed literals only, for instance, `select timestamp'now'`. In Spark 3.1 and 3.0, such special values are supported in any casts of strings to dates/timestamps. To restore the behavior before Spark 3.2, you should preprocess string columns and convert the strings to desired dates/timestamps explicitly using UDF for instance.
```
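As a hedged sketch of the behavior change this migration note describes (the exact results depend on the Spark version and assume the default, non-ANSI cast mode):

```sql
-- Typed literal: special datetime values keep working in Spark 3.2.
SELECT DATE'today';           -- current date in the session time zone

-- Cast from string: Spark 3.1/3.0 resolve the special value,
-- while Spark 3.2 returns NULL under the default cast mode.
SELECT CAST('today' AS DATE);
```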
[GitHub] [spark] yaooqinn commented on a change in pull request #32714: [SPARK-35581][SQL] Support special datetime values in typed literals only
yaooqinn commented on a change in pull request #32714:
URL: https://github.com/apache/spark/pull/32714#discussion_r642757369

## File path: docs/sql-migration-guide.md

## @@ -91,6 +91,8 @@ license: |
 - In Spark 3.2, `CREATE TABLE AS SELECT` with non-empty `LOCATION` will throw `AnalysisException`. To restore the behavior before Spark 3.2, you can set `spark.sql.legacy.allowNonEmptyLocationInCTAS` to `true`.
+ - In Spark 3.2, the special datetime values such as `epoch`, `today`, `yesterday`, `tomorrow` and `now` are supported in typed literals only, for instance `select timestamp'now'`. In Spark 3.1 and earlier, such special values are supported in any casts of strings to dates/timestamps. To restore the behavior before Spark 3.2, you should preprocess string columns and convert the strings to desired timestamps explicitly using UDF for instance.

Review comment:
In Spark 3.2, ~the~ special datetime values ... in typed literals only, for instance **(add ',')** `select timestamp'now'`. In Spark 3.1 and ~earlier~ (3.0?)