This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new a511ca13 [SPARK-38534][SQL][TESTS] Disable `to_timestamp('366', 'DD')` test case
a511ca13 is described below
commit a511ca13ab392a620e2731d217cc273de9cf1b10
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Sat Mar 12 02:47:57 2022 -0800
[SPARK-38534][SQL][TESTS] Disable `to_timestamp('366', 'DD')` test case
### What changes were proposed in this pull request?
This PR aims to disable the `to_timestamp('366', 'DD')` test case to recover the `ansi` test suite on Java 11+.
### Why are the changes needed?
Currently, the daily Java 11 and Java 17 GitHub Actions jobs are broken.
- https://github.com/apache/spark/runs/5511239176?check_suite_focus=true
- https://github.com/apache/spark/runs/5513540604?check_suite_focus=true
**Java 8**
```
$ bin/spark-shell --conf spark.sql.ansi.enabled=true
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/03/12 00:59:31 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://172.16.0.31:4040
Spark context available as 'sc' (master = local[*], app id = local-1647075572229).
Spark session available as 'spark'.
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 3.3.0-SNAPSHOT
/_/
Using Scala version 2.12.15 (OpenJDK 64-Bit Server VM, Java 1.8.0_322)
Type in expressions to have them evaluated.
Type :help for more information.
scala> sql("select to_timestamp('366', 'DD')").show
java.time.format.DateTimeParseException: Text '366' could not be parsed, unparsed text found at index 2. If necessary set spark.sql.ansi.enabled to false to bypass this error.
```
**Java 11+**
```
$ bin/spark-shell --conf spark.sql.ansi.enabled=true
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/03/12 01:00:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://172.16.0.31:4040
Spark context available as 'sc' (master = local[*], app id = local-1647075607932).
Spark session available as 'spark'.
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 3.3.0-SNAPSHOT
/_/
Using Scala version 2.12.15 (OpenJDK 64-Bit Server VM, Java 11.0.12)
Type in expressions to have them evaluated.
Type :help for more information.
scala> sql("select to_timestamp('366', 'DD')").show
java.time.DateTimeException: Invalid date 'DayOfYear 366' as '1970' is not a leap year. If necessary set spark.sql.ansi.enabled to false to bypass this error.
```
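The divergence above can be reproduced with plain `java.time`, outside Spark: Java 8's `DD` pattern parses exactly two digits and rejects the trailing `6` as unparsed text, while Java 9+ accepts up to three digits and then fails to resolve day 366 against a non-leap year. A minimal sketch (the `DayOfYearParse` class, its `tryParse` helper, and the hard-coded year-1970 default are illustrative assumptions mirroring Spark's parser, not Spark code):

```java
import java.time.DateTimeException;
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeFormatterBuilder;
import java.time.temporal.ChronoField;

public class DayOfYearParse {
    // Parse a bare day-of-year string with pattern "DD", defaulting the
    // year to 1970 the way Spark's timestamp parser does. Returns "ok" on
    // success, otherwise the simple name of the root-cause exception.
    static String tryParse(String dayOfYear) {
        DateTimeFormatter fmt = new DateTimeFormatterBuilder()
                .appendPattern("DD")
                .parseDefaulting(ChronoField.YEAR, 1970)
                .toFormatter();
        try {
            LocalDate.parse(dayOfYear, fmt);
            return "ok";
        } catch (DateTimeException e) {
            // Unwrap: DateTimeFormatter wraps resolution failures in a
            // DateTimeParseException, so report the root cause.
            Throwable root = e;
            while (root.getCause() != null) {
                root = root.getCause();
            }
            // Java 8:  DateTimeParseException ("unparsed text found at index 2")
            // Java 11+: DateTimeException ("... '1970' is not a leap year")
            return root.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        System.out.println(tryParse("31"));   // a 2-digit day parses on every JDK
        System.out.println(tryParse("366"));  // fails on every JDK, but differently
    }
}
```

Both JDK lines fail for `'366'`, but with different exception types and messages, which is why a single golden-file answer cannot match both runtimes.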
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Tested with Java 11+.
**BEFORE**
```
$ java -version
openjdk version "17.0.2" 2022-01-18 LTS
OpenJDK Runtime Environment Zulu17.32+13-CA (build 17.0.2+8-LTS)
OpenJDK 64-Bit Server VM Zulu17.32+13-CA (build 17.0.2+8-LTS, mixed mode, sharing)
$ build/sbt "sql/testOnly org.apache.spark.sql.SQLQueryTestSuite -- -z ansi/datetime-parsing-invalid.sql"
...
[info] SQLQueryTestSuite:
01:23:00.219 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
01:23:05.209 ERROR org.apache.spark.sql.SQLQueryTestSuite: Error using configs:
[info] - ansi/datetime-parsing-invalid.sql *** FAILED *** (267 milliseconds)
[info] ansi/datetime-parsing-invalid.sql
[info] Expected "java.time.[format.DateTimeParseException
[info] Text '366' could not be parsed, unparsed text found at index 2]. If necessary set s...", but got "java.time.[DateTimeException
[info] Invalid date 'DayOfYear 366' as '1970' is not a leap year]. If necessary set s..." Result did not match for query #8
[info] select to_timestamp('366', 'DD') (SQLQueryTestSuite.scala:476)
...
[info] Run completed in 7 seconds, 389 milliseconds.
[info] Total number of tests run: 1
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 0, failed 1, canceled 0, ignored 0, pending 0
[info] *** 1 TEST FAILED ***
[error] Failed tests:
[error] org.apache.spark.sql.SQLQueryTestSuite
[error] (sql / Test / testOnly) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 21 s, completed Mar 12, 2022, 1:23:05 AM
```
**AFTER**
```
$ build/sbt "sql/testOnly org.apache.spark.sql.SQLQueryTestSuite -- -z ansi/datetime-parsing-invalid.sql"
...
[info] SQLQueryTestSuite:
[info] - ansi/datetime-parsing-invalid.sql (390 milliseconds)
...
[info] Run completed in 7 seconds, 673 milliseconds.
[info] Total number of tests run: 1
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 1, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[success] Total time: 20 s, completed Mar 12, 2022, 1:24:52 AM
```
Closes #35825 from dongjoon-hyun/SPARK-38534.
Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
---
.../resources/sql-tests/inputs/datetime-parsing-invalid.sql | 3 ++-
.../sql-tests/results/ansi/datetime-parsing-invalid.sql.out | 11 +----------
.../sql-tests/results/datetime-parsing-invalid.sql.out | 10 +---------
3 files changed, 4 insertions(+), 20 deletions(-)
diff --git a/sql/core/src/test/resources/sql-tests/inputs/datetime-parsing-invalid.sql b/sql/core/src/test/resources/sql-tests/inputs/datetime-parsing-invalid.sql
index a6d743c..1d1e2a5 100644
--- a/sql/core/src/test/resources/sql-tests/inputs/datetime-parsing-invalid.sql
+++ b/sql/core/src/test/resources/sql-tests/inputs/datetime-parsing-invalid.sql
@@ -14,7 +14,8 @@ select to_timestamp('366', 'D');
select to_timestamp('9', 'DD');
-- in java 8 this case is invalid, but valid in java 11, disabled for jenkins
-- select to_timestamp('100', 'DD');
-select to_timestamp('366', 'DD');
+-- The error message is changed since Java 11+
+-- select to_timestamp('366', 'DD');
select to_timestamp('9', 'DDD');
select to_timestamp('99', 'DDD');
select to_timestamp('30-365', 'dd-DDD');
diff --git a/sql/core/src/test/resources/sql-tests/results/ansi/datetime-parsing-invalid.sql.out b/sql/core/src/test/resources/sql-tests/results/ansi/datetime-parsing-invalid.sql.out
index 5dc3b85..59761d5 100644
--- a/sql/core/src/test/resources/sql-tests/results/ansi/datetime-parsing-invalid.sql.out
+++ b/sql/core/src/test/resources/sql-tests/results/ansi/datetime-parsing-invalid.sql.out
@@ -1,5 +1,5 @@
-- Automatically generated by SQLQueryTestSuite
--- Number of queries: 29
+-- Number of queries: 28
-- !query
@@ -75,15 +75,6 @@ You may get a different result due to the upgrading to Spark >= 3.0: Fail to par
-- !query
-select to_timestamp('366', 'DD')
--- !query schema
-struct<>
--- !query output
-java.time.format.DateTimeParseException
-Text '366' could not be parsed, unparsed text found at index 2. If necessary set spark.sql.ansi.enabled to false to bypass this error.
-
-
--- !query
select to_timestamp('9', 'DDD')
-- !query schema
struct<>
diff --git a/sql/core/src/test/resources/sql-tests/results/datetime-parsing-invalid.sql.out b/sql/core/src/test/resources/sql-tests/results/datetime-parsing-invalid.sql.out
index 3350470..9fc2887 100644
--- a/sql/core/src/test/resources/sql-tests/results/datetime-parsing-invalid.sql.out
+++ b/sql/core/src/test/resources/sql-tests/results/datetime-parsing-invalid.sql.out
@@ -1,5 +1,5 @@
-- Automatically generated by SQLQueryTestSuite
--- Number of queries: 29
+-- Number of queries: 28
-- !query
@@ -73,14 +73,6 @@ You may get a different result due to the upgrading to Spark >= 3.0: Fail to par
-- !query
-select to_timestamp('366', 'DD')
--- !query schema
-struct<to_timestamp(366, DD):timestamp>
--- !query output
-NULL
-
-
--- !query
select to_timestamp('9', 'DDD')
-- !query schema
struct<>
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]