[jira] [Commented] (SPARK-38324) The second range is not [0, 59] in the day time ANSI interval

2023-02-15 Thread Apache Spark (Jira)


[ https://issues.apache.org/jira/browse/SPARK-38324?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17688938#comment-17688938 ]

Apache Spark commented on SPARK-38324:
--

User 'haoyanzhang' has created a pull request for this issue:
https://github.com/apache/spark/pull/40033
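
The linked PR's contents aren't quoted in this thread. As a hedged illustration of the strict range check the report asks for (the function name and error text below are assumptions modeled on the minute check quoted further down, not the PR's actual code), a standalone Python sketch:

def check_interval_second(second: int) -> None:
    # Hypothetical helper: Spark's real fix belongs in its interval parser,
    # but the expected contract mirrors the existing minute-field check.
    # Fractional seconds (the doc's [0..59.99] range) are out of scope here.
    if not 0 <= second <= 59:
        raise ValueError(
            f"requirement failed: second {second} outside range [0, 59]")

check_interval_second(39)  # ok
check_interval_second(99)  # raises, which is what the report expects for '01:01:99'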

> The second range is not [0, 59] in the day time ANSI interval
> --------------------------------------------------------------
>
> Key: SPARK-38324
> URL: https://issues.apache.org/jira/browse/SPARK-38324
> Project: Spark
>  Issue Type: Bug
>  Components: Java API
>Affects Versions: 3.3.0
> Environment: Spark 3.3.0 snapshot
>Reporter: chong
>Priority: Major
>
> [https://spark.apache.org/docs/latest/sql-ref-datatypes.html]
>  * SECOND, seconds within minutes and possibly fractions of a second [0..59.99]
> The doc says SECOND is seconds within minutes, so its range should be [0, 59].
>  
> But testing shows that 99 seconds is accepted:
> >>> spark.sql("select INTERVAL '10 01:01:99' DAY TO SECOND")
> DataFrame[INTERVAL '10 01:02:39' DAY TO SECOND: interval day to second]
>  
> Meanwhile, the minute range check works as expected:
> >>> spark.sql("select INTERVAL '10 01:60:01' DAY TO SECOND")
> requirement failed: minute 60 outside range [0, 59](line 1, pos 16)
> == SQL ==
> select INTERVAL '10 01:60:01' DAY TO SECOND
> ^^^
>  
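
For anyone reproducing the two behaviors quoted above, a minimal self-contained PySpark sketch (the local session setup is illustrative; the report used a Spark 3.3.0 snapshot):

from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[1]").getOrCreate()

# Seconds field: per the report, 99 is accepted and silently normalized
# to 1 minute 39 seconds instead of being rejected.
df = spark.sql("select INTERVAL '10 01:01:99' DAY TO SECOND")
print(df)  # DataFrame[INTERVAL '10 01:02:39' DAY TO SECOND: interval day to second]

# Minute field: 60 is rejected at parse time, matching the quoted error.
try:
    spark.sql("select INTERVAL '10 01:60:01' DAY TO SECOND")
except Exception as e:
    print(e)  # requirement failed: minute 60 outside range [0, 59](line 1, pos 16)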



[jira] [Commented] (SPARK-38324) The second range is not [0, 59] in the day time ANSI interval

2023-02-14 Thread Apache Spark (Jira)


[ https://issues.apache.org/jira/browse/SPARK-38324?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17688807#comment-17688807 ]

Apache Spark commented on SPARK-38324:
--

User 'haoyanzhang' has created a pull request for this issue:
https://github.com/apache/spark/pull/40028

> The second range is not [0, 59] in the day time ANSI interval
> --------------------------------------------------------------
>
> Key: SPARK-38324
> URL: https://issues.apache.org/jira/browse/SPARK-38324
> Project: Spark
>  Issue Type: Bug
>  Components: Java API
>Affects Versions: 3.3.0
> Environment: Spark 3.3.0 snapshot
>Reporter: chong
>Priority: Major
>
> [https://spark.apache.org/docs/latest/sql-ref-datatypes.html]
>  * SECOND, seconds within minutes and possibly fractions of a second [0..59.99]
> The doc says SECOND is seconds within minutes, so its range should be [0, 59].
>  
> But testing shows that 99 seconds is accepted:
> >>> spark.sql("select INTERVAL '10 01:01:99' DAY TO SECOND")
> DataFrame[INTERVAL '10 01:02:39' DAY TO SECOND: interval day to second]
>  
> Meanwhile, the minute range check works as expected:
> >>> spark.sql("select INTERVAL '10 01:60:01' DAY TO SECOND")
> requirement failed: minute 60 outside range [0, 59](line 1, pos 16)
> == SQL ==
> select INTERVAL '10 01:60:01' DAY TO SECOND
> ^^^
>  



[jira] [Commented] (SPARK-38324) The second range is not [0, 59] in the day time ANSI interval

2022-02-24 Thread Hyukjin Kwon (Jira)


[ https://issues.apache.org/jira/browse/SPARK-38324?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17497905#comment-17497905 ]

Hyukjin Kwon commented on SPARK-38324:
--

cc [~Gengliang.Wang] FYI

> The second range is not [0, 59] in the day time ANSI interval
> --------------------------------------------------------------
>
> Key: SPARK-38324
> URL: https://issues.apache.org/jira/browse/SPARK-38324
> Project: Spark
>  Issue Type: Bug
>  Components: Java API
>Affects Versions: 3.3.0
> Environment: Spark 3.3.0 snapshot
>Reporter: chong
>Priority: Major
>
> [https://spark.apache.org/docs/latest/sql-ref-datatypes.html]
>  * SECOND, seconds within minutes and possibly fractions of a second [0..59.99]
> The doc says SECOND is seconds within minutes, so its range should be [0, 59].
>  
> But testing shows that 99 seconds is accepted:
> >>> spark.sql("select INTERVAL '10 01:01:99' DAY TO SECOND")
> DataFrame[INTERVAL '10 01:02:39' DAY TO SECOND: interval day to second]



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org