Hyukjin Kwon created SPARK-34938:
------------------------------------
Summary: Recover the interval case in the benchmark of ExtractBenchmark
Key: SPARK-34938
URL: https://issues.apache.org/jira/browse/SPARK-34938
Project: Spark
Issue Type: Bug
Components: Tests
Affects Versions: 3.2.0
Reporter: Hyukjin Kwon
{code}
Running benchmark: Invoke extract for interval
Running case: cast to interval
21/04/02 10:40:01 INFO BlockManagerInfo: Removed broadcast_349_piece0 on 192.168.35.219:55076 in memory (size: 7.3 KiB, free: 434.4 MiB)
21/04/02 10:40:01 INFO BlockManagerInfo: Removed broadcast_348_piece0 on 192.168.35.219:55076 in memory (size: 7.3 KiB, free: 434.4 MiB)
21/04/02 10:40:01 INFO BlockManagerInfo: Removed broadcast_345_piece0 on 192.168.35.219:55076 in memory (size: 7.3 KiB, free: 434.4 MiB)
21/04/02 10:40:01 INFO BlockManagerInfo: Removed broadcast_346_piece0 on 192.168.35.219:55076 in memory (size: 7.3 KiB, free: 434.4 MiB)
21/04/02 10:40:01 INFO BlockManagerInfo: Removed broadcast_347_piece0 on 192.168.35.219:55076 in memory (size: 7.3 KiB, free: 434.4 MiB)
Exception in thread "main" org.apache.spark.sql.AnalysisException: cannot resolve 'subtractdates(CAST(timestamp_seconds(id) AS DATE), DATE '0001-01-01') + subtracttimestamps(timestamp_seconds(id), TIMESTAMP '1000-01-01 01:02:03.123456')' due to data type mismatch: argument 1 requires timestamp type, however, 'subtractdates(CAST(timestamp_seconds(id) AS DATE), DATE '0001-01-01')' is of day-time interval type.; line 1 pos 0;
'Project [unresolvedalias(cast(subtractdates(cast(timestamp_seconds(id#1400L) as date), 0001-01-01, false) + subtracttimestamps(timestamp_seconds(id#1400L), 1000-01-01 01:02:03.123456) as day-time interval), Some(org.apache.spark.sql.Column$$Lambda$1282/0x0000000800bd5040@5bff7f13))]
+- Range (1262304000, 1272304000, step=1, splits=Some(1))
{code}
The "cast to interval" benchmark case of ExtractBenchmark is broken: analysis fails because the date difference now resolves to a day-time interval, so the addition with the timestamp difference no longer type-checks. The case should be recovered.
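The failing expression can be reproduced outside the benchmark. Below is a minimal sketch, not the benchmark code itself: it builds the same range and the same date-difference-plus-timestamp-difference expression seen in the plan above; the SparkSession setup, the object name, the column alias, and the noop write are assumptions for a standalone run.

{code:scala}
import org.apache.spark.sql.SparkSession

object IntervalCaseRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ExtractBenchmark interval repro")
      .master("local[1]")
      .getOrCreate()

    // Same range as in the plan above: Range (1262304000, 1272304000, step=1, splits=Some(1)).
    // The date difference resolves to a day-time interval, so the '+' with the
    // timestamp difference no longer type-checks and analysis throws the exception above.
    spark.range(1262304000L, 1272304000L, 1, 1)
      .selectExpr(
        "(cast(timestamp_seconds(id) as date) - date'0001-01-01') + " +
          "(timestamp_seconds(id) - timestamp'1000-01-01 01:02:03.123456') AS dt")
      .write.format("noop").mode("overwrite").save()

    spark.stop()
  }
}
{code}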