MaxGekk commented on code in PR #48893:
URL: https://github.com/apache/spark/pull/48893#discussion_r1851008337
##########
sql/core/src/test/resources/sql-tests/results/window.sql.out:
##########
@@ -1442,3 +1442,69 @@ a 1 2
a NULL 1
b 1 1
b 2 2
+
+
+-- !query
+SELECT *, mean(val_double) over (partition BY val ORDER BY val_date RANGE INTERVAL '5' DAY PRECEDING) AS mean FROM testData
+-- !query schema
+struct<val:int,val_long:bigint,val_double:double,val_date:date,val_timestamp:timestamp,cate:string,mean:double>
+-- !query output
+1 1 1.0 2017-08-01 2017-07-31 17:00:00 a 1.0
+1 2 2.5 2017-08-02 2017-08-05 23:13:20 a 1.5
+1 NULL 1.0 2017-08-01 2017-07-31 17:00:00 b 1.0
+2 2147483650 100.001 2020-12-31 2020-12-30 16:00:00 a 100.001
+2 3 3.3 2017-08-03 2017-08-17 13:00:00 b 3.3
+3 1 1.0 2017-08-01 2017-07-31 17:00:00 NULL 1.0
+3 2147483650 100.001 2020-12-31 2020-12-30 16:00:00 b 100.001
+NULL 1 1.0 2017-08-01 2017-07-31 17:00:00 a 1.0
+NULL NULL NULL NULL NULL NULL NULL
+
+
+-- !query
+SELECT *, mean(val_double) over (partition BY val ORDER BY val_date RANGE INTERVAL '1 2:3:4.001' DAY TO SECOND PRECEDING) AS mean FROM testData
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.sql.catalyst.ExtendedAnalysisException
+{
+ "errorClass" : "DATATYPE_MISMATCH.RANGE_FRAME_INVALID_TYPE",
Review Comment:
The full error message after substitution is slightly confusing:
```
The data type "DATE" used in the order specification does not match the data type "INTERVAL DAY TO SECOND" which is used in the range frame.
```
I think we could improve the error message. @mihailom-db Could you open a JIRA for this, please?
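
For context, a minimal standalone reproduction sketch of the two cases. The table `t` and its columns are hypothetical and not part of this PR's test data; the behavior shown is the one recorded in the results above.
```
-- Hypothetical table with a DATE column to order by.
CREATE TEMPORARY VIEW t AS SELECT * FROM VALUES
  (DATE'2017-08-01', 1.0), (DATE'2017-08-03', 3.0) AS tab(d, x);

-- Accepted: a DAY interval boundary matches the DATE order column.
SELECT x, mean(x) OVER (ORDER BY d RANGE INTERVAL '5' DAY PRECEDING) AS m FROM t;

-- Rejected with DATATYPE_MISMATCH.RANGE_FRAME_INVALID_TYPE: the analyzer treats
-- an INTERVAL DAY TO SECOND boundary as not matching a DATE order column.
SELECT x, mean(x) OVER (ORDER BY d RANGE INTERVAL '1 2:3:4.001' DAY TO SECOND PRECEDING) AS m FROM t;
```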
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]