This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.4 by this push:
new f36325d76cf [SPARK-42770][CONNECT] Add `truncatedTo(ChronoUnit.MICROS)` to make `SQLImplicitsTestSuite` in Java 17 daily test GA task pass
f36325d76cf is described below
commit f36325d76cf00dd7baef513a20a57686f41b87dd
Author: yangjie01 <[email protected]>
AuthorDate: Tue Mar 14 08:53:15 2023 -0700
[SPARK-42770][CONNECT] Add `truncatedTo(ChronoUnit.MICROS)` to make `SQLImplicitsTestSuite` in Java 17 daily test GA task pass
### What changes were proposed in this pull request?
Running `LocalDateTime.now()` and `Instant.now()` with Java 8 and 11 always yields microsecond precision on both Linux and macOS, but Java 17 behaves differently: it yields full nanosecond precision on Linux, while still yielding only microsecond precision on macOS.
On Linux (CentOS):
```
jshell> java.time.LocalDateTime.now()
$1 ==> 2023-03-13T18:09:12.498162194
jshell> java.time.Instant.now()
$2 ==> 2023-03-13T10:09:16.013993186Z
```
On macOS:
```
jshell> java.time.LocalDateTime.now()
$1 ==> 2023-03-13T17:13:47.485897
jshell> java.time.Instant.now()
$2 ==> 2023-03-13T09:15:12.031850Z
```
At present, Spark always converts these values to microseconds, which makes `test implicit encoder resolution` in `SQLImplicitsTestSuite` fail when using Java 17 on Linux. This PR therefore adds `truncatedTo(ChronoUnit.MICROS)` when testing on Linux with Java 17, so that the test input data is also limited to microsecond precision.
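For illustration only, here is a minimal standalone Scala sketch (not part of the patch; `TruncationSketch` is a hypothetical object name) showing what `truncatedTo(ChronoUnit.MICROS)` does to a nanosecond-precision value:
```scala
import java.time.{Instant, LocalDateTime}
import java.time.temporal.ChronoUnit

object TruncationSketch extends App {
  // On a JDK/OS combination with a nanosecond clock (e.g. Java 17 on Linux),
  // these values may carry sub-microsecond digits.
  val rawInstant: Instant = Instant.now()
  val rawLocalDateTime: LocalDateTime = LocalDateTime.now()

  // Truncation drops everything below microsecond precision, which is the
  // precision Spark keeps for timestamp values.
  val truncatedInstant: Instant = rawInstant.truncatedTo(ChronoUnit.MICROS)
  val truncatedLocalDateTime: LocalDateTime = rawLocalDateTime.truncatedTo(ChronoUnit.MICROS)

  // e.g. 2023-03-13T10:09:16.013993186Z -> 2023-03-13T10:09:16.013993Z
  println(s"raw Instant:       $rawInstant")
  println(s"truncated Instant: $truncatedInstant")
  println(s"raw LocalDateTime:       $rawLocalDateTime")
  println(s"truncated LocalDateTime: $truncatedLocalDateTime")
}
```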
### Why are the changes needed?
Make the Java 17 daily test GA task run successfully. It currently fails as follows:
```
[info] - test implicit encoder resolution *** FAILED *** (1 second, 329 milliseconds)
[info]   2023-03-02T23:00:20.404434 did not equal 2023-03-02T23:00:20.404434875 (SQLImplicitsTestSuite.scala:63)
[info]   org.scalatest.exceptions.TestFailedException:
[info]   at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472)
[info]   at org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:471)
[info]   at org.scalatest.Assertions$.newAssertionFailedException(Assertions.scala:1231)
[info]   at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:1295)
[info]   at org.apache.spark.sql.SQLImplicitsTestSuite.testImplicit$1(SQLImplicitsTestSuite.scala:63)
[info]   at org.apache.spark.sql.SQLImplicitsTestSuite.$anonfun$new$2(SQLImplicitsTestSuite.scala:133)
```
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
Manually checked with Java 17.
Closes #40395 from LuciferYang/SPARK-42770.
Authored-by: yangjie01 <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
(cherry picked from commit 3a2571c3978e7271388b5e56267e3bd60c5a4712)
Signed-off-by: Dongjoon Hyun <[email protected]>
---
.../org/apache/spark/sql/SQLImplicitsTestSuite.scala | 20 +++++++++++++++++---
1 file changed, 17 insertions(+), 3 deletions(-)
diff --git a/connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/SQLImplicitsTestSuite.scala b/connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/SQLImplicitsTestSuite.scala
index f3261ac4850..470736fbebe 100644
--- a/connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/SQLImplicitsTestSuite.scala
+++ b/connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/SQLImplicitsTestSuite.scala
@@ -18,9 +18,11 @@ package org.apache.spark.sql
 
 import java.sql.{Date, Timestamp}
 import java.time.{Duration, Instant, LocalDate, LocalDateTime, Period}
+import java.time.temporal.ChronoUnit
 import java.util.concurrent.atomic.AtomicLong
 
 import io.grpc.inprocess.InProcessChannelBuilder
+import org.apache.commons.lang3.{JavaVersion, SystemUtils}
 import org.scalatest.BeforeAndAfterAll
 
 import org.apache.spark.connect.proto
@@ -130,9 +132,21 @@ class SQLImplicitsTestSuite extends ConnectFunSuite with BeforeAndAfterAll {
     testImplicit(BigDecimal(decimal))
     testImplicit(Date.valueOf(LocalDate.now()))
     testImplicit(LocalDate.now())
-    testImplicit(LocalDateTime.now())
-    testImplicit(Instant.now())
-    testImplicit(Timestamp.from(Instant.now()))
+    // SPARK-42770: Run `LocalDateTime.now()` and `Instant.now()` with Java 8 & 11 always
+    // get microseconds on both Linux and MacOS, but there are some differences when
+    // using Java 17, it will get accurate nanoseconds on Linux, but still get the microseconds
+    // on MacOS. At present, Spark always converts them to microseconds, this will cause the
+    // test fail when using Java 17 on Linux, so add `truncatedTo(ChronoUnit.MICROS)` when
+    // testing on Linux using Java 17 to ensure the accuracy of input data is microseconds.
+    if (SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_17) && SystemUtils.IS_OS_LINUX) {
+      testImplicit(LocalDateTime.now().truncatedTo(ChronoUnit.MICROS))
+      testImplicit(Instant.now().truncatedTo(ChronoUnit.MICROS))
+      testImplicit(Timestamp.from(Instant.now().truncatedTo(ChronoUnit.MICROS)))
+    } else {
+      testImplicit(LocalDateTime.now())
+      testImplicit(Instant.now())
+      testImplicit(Timestamp.from(Instant.now()))
+    }
     testImplicit(Period.ofYears(2))
     testImplicit(Duration.ofMinutes(77))
     testImplicit(SaveMode.Append)
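As a side note, the platform check in the patch relies on Apache Commons Lang 3, which the test file imports above. A small standalone sketch of that guard on its own (`ClockPrecisionGuard` is a hypothetical object name, not code from the patch):
```scala
import org.apache.commons.lang3.{JavaVersion, SystemUtils}

object ClockPrecisionGuard extends App {
  // Mirrors the condition used in the patch: only Java 17+ on Linux is known
  // to produce nanosecond-precision values from LocalDateTime.now()/Instant.now(),
  // so only there do the test inputs need to be truncated to microseconds.
  val needsTruncation: Boolean =
    SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_17) && SystemUtils.IS_OS_LINUX

  println(s"Truncate test inputs to microseconds: $needsTruncation")
}
```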
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]