cloud-fan commented on code in PR #32959:
URL: https://github.com/apache/spark/pull/32959#discussion_r884900931
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala:
##########
@@ -518,44 +516,55 @@ object DateTimeUtils {
* The return type is [[Option]] in order to distinguish between 0 and null. The following
* formats are allowed:
*
- * `yyyy`
- * `yyyy-[m]m`
- * `yyyy-[m]m-[d]d`
- * `yyyy-[m]m-[d]d `
- * `yyyy-[m]m-[d]d *`
- * `yyyy-[m]m-[d]dT*`
+ * `[+-]yyyy*`
+ * `[+-]yyyy*-[m]m`
+ * `[+-]yyyy*-[m]m-[d]d`
+ * `[+-]yyyy*-[m]m-[d]d `
+ * `[+-]yyyy*-[m]m-[d]d *`
+ * `[+-]yyyy*-[m]m-[d]dT*`
*/
def stringToDate(s: UTF8String): Option[Int] = {
- if (s == null) {
+ def isValidDigits(segment: Int, digits: Int): Boolean = {
+ // An integer is able to represent a date within [+-]5 million years.
+ var maxDigitsYear = 7
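(Editor's note, not part of the hunk: a minimal sketch of how the widened `[+-]yyyy*` format can be exercised through `stringToDate`. It assumes the catalyst and unsafe modules are on the classpath; the expected results follow from the format list and the `maxDigitsYear` comment above, not from code shown in this excerpt.)

```scala
import org.apache.spark.sql.catalyst.util.DateTimeUtils
import org.apache.spark.unsafe.types.UTF8String

object StringToDateSketch {
  def main(args: Array[String]): Unit = {
    // Plain Gregorian date: expected Some(days since 1970-01-01).
    println(DateTimeUtils.stringToDate(UTF8String.fromString("2021-7-4")))
    // Signed six-digit year, matching `[+-]yyyy*-[m]m-[d]d`: expected Some(negative day count).
    println(DateTimeUtils.stringToDate(UTF8String.fromString("-123456-01-01")))
    // Eight-digit year: expected None if the year segment is capped at 7 digits (maxDigitsYear).
    println(DateTimeUtils.stringToDate(UTF8String.fromString("12345678-01-01")))
  }
}
```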
Review Comment:
BTW, I don't think it's possible to add a Spark config to forbid large
datetime values. The literal is just one place; there are many other datetime
operations that may produce large datetime values, and they existed before
this PR.
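(Editor's note: a hedged illustration of the point above, not taken from the PR. Queries like the following can already produce dates far outside the four-digit-year range without going through literal parsing; it assumes a local SparkSession on a Spark version that provides `date_add` and `make_date`.)

```scala
import org.apache.spark.sql.SparkSession

object LargeDateSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .appName("large-date-sketch")
      .getOrCreate()

    // Adding days to an already-large date pushes the year well past 9999.
    spark.sql("SELECT date_add(DATE'9999-12-31', 100000)").show(false)
    // make_date can build a large year directly, with no date literal involved.
    spark.sql("SELECT make_date(123456, 1, 1)").show(false)

    spark.stop()
  }
}
```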