LuciferYang commented on PR #45817:
URL: https://github.com/apache/spark/pull/45817#issuecomment-2291395844

   That `reflection` is used as a workaround for a cross-compilation testing scenario: a Maven build or test that uses Java 21 with `-release 17`.
   
   The specific problem scenario can be reproduced as follows:
   
   1. `git reset --hard 4c1405d10b8fceba7e1486bd4e2e2a24596a71a7`
   2. Revert the changes made to `SparkDateTimeUtils`
   3. Build the `sql/api` module with Maven using Java 21: `build/mvn clean install -DskipTests -pl sql/api -am`
   
   You should then see compilation errors like:
   
   ```
   [INFO] compiling 94 Scala sources and 42 Java sources to 
/spark-source/sql/api/target/scala-2.13/classes ...
   [ERROR] [Error] 
/spark-source/sql/api/src/main/scala/org/apache/spark/sql/catalyst/util/SparkDateTimeUtils.scala:27:
 object util is not a member of package sun
   [ERROR] [Error] 
/spark-source/sql/api/src/main/scala/org/apache/spark/sql/catalyst/util/SparkDateTimeUtils.scala:218:
 not found: type ZoneInfo
   [ERROR] [Error] 
/spark-source/sql/api/src/main/scala/org/apache/spark/sql/catalyst/util/SparkDateTimeUtils.scala:218:
 value getOffsetsByWall is not a member of java.util.TimeZone
   [ERROR] [Error] 
/spark-source/sql/api/src/main/scala/org/apache/spark/sql/catalyst/util/SparkDateTimeUtils.scala:27:
 Unused import
   Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the 
message>, cat=unused-imports, site=org.apache.spark.sql.catalyst.util
   [ERROR] four errors found
   [INFO] 
------------------------------------------------------------------------
   [INFO] Reactor Summary for Spark Project Parent POM 4.0.0-SNAPSHOT:
   [INFO] 
   [INFO] Spark Project Parent POM ........................... SUCCESS [ 27.177 
s]
   [INFO] Spark Project Tags ................................. SUCCESS [ 11.821 
s]
   [INFO] Spark Project Common Utils ......................... SUCCESS [ 12.104 
s]
   [INFO] Spark Project Variant .............................. SUCCESS [  3.165 
s]
   [INFO] Spark Project Unsafe ............................... SUCCESS [  8.262 
s]
   [INFO] Spark Project SQL API .............................. FAILURE [ 12.878 
s]
   [INFO] 
------------------------------------------------------------------------
   [INFO] BUILD FAILURE
   [INFO] 
------------------------------------------------------------------------
   [INFO] Total time:  01:15 min
   [INFO] Finished at: 2024-08-15T22:05:08+08:00
   [INFO] 
------------------------------------------------------------------------
   [ERROR] Failed to execute goal 
net.alchim31.maven:scala-maven-plugin:4.8.1:compile (scala-compile-first) on 
project spark-sql-api_2.13: Execution scala-compile-first of goal 
net.alchim31.maven:scala-maven-plugin:4.8.1:compile failed: 
org.apache.commons.exec.ExecuteException: Process exited with an error: 255 
(Exit value: 255) -> [Help 1]
   ```
   
   So, to make this cross-build and test scenario work normally without changing the original logic, reflection is used there.
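   As an illustrative sketch (the names below are mine, not the actual Spark code), the idea is to resolve the JDK-internal class reflectively at runtime rather than referencing it at compile time, since `-release 17` rejects compile-time references to the `sun.util` package:

   ```scala
   import java.util.TimeZone

   // Hypothetical sketch, not the actual Spark implementation: look up the
   // JDK-internal sun.util.calendar.ZoneInfo class via reflection at runtime.
   // A direct `import sun.util.calendar.ZoneInfo` fails to compile under
   // `-release 17`, because that package is not part of the documented JDK API.
   object ZoneInfoReflection {
     // None if the internal class is absent on the running JDK.
     val zoneInfoClass: Option[Class[_]] =
       try Some(Class.forName("sun.util.calendar.ZoneInfo"))
       catch { case _: ClassNotFoundException => None }

     // True when the given TimeZone is backed by the internal ZoneInfo class.
     def isZoneInfo(tz: TimeZone): Boolean =
       zoneInfoClass.exists(_.isInstance(tz))
   }
   ```

   On a stock OpenJDK, `ZoneInfoReflection.isZoneInfo(TimeZone.getDefault)` typically returns `true`, since the default `TimeZone` implementation class is `sun.util.calendar.ZoneInfo`. This keeps the compile-time classpath clean while preserving the original runtime behavior.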
   
   The current PR was a later attempt of mine, but since I am not very familiar with the background of why `sun.util.calendar.ZoneInfo` needs special handling here, I ultimately gave up on this PR.
   
   
   Additionally, I would like to clarify something regarding the unit tests:
   1. If you execute them with Maven or SBT, you do not need to add `--add-opens`.
   2. After the merge of [https://github.com/apache/spark/pull/46457](https://github.com/apache/spark/pull/46457), running the tests in an IDE should be no different from before: you may only need to manually add `--add-opens` to the `VM options` for tests that actually use the `SparkDateTimeUtils#toJavaDate` function (because Spark 4.0 uses Java 17). This is because the IDE may not correctly pick up the Java options configured in `pom.xml` or `SparkBuild.scala`.
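   For reference, the `VM options` entry in question would look something like the following (the module/package choice is my assumption based on where `ZoneInfo` lives; the exact set of opens a given test needs may differ):

   ```
   --add-opens=java.base/sun.util.calendar=ALL-UNNAMED
   ```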

