This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 6dc628c31cdf [SPARK-49809][BUILD] Use `sbt.IO` in `SparkBuild.scala` 
to avoid naming conflicts with `java.io.IO` in Java 23
6dc628c31cdf is described below

commit 6dc628c31cdf48769ccd80cd2b81f7bd6386276f
Author: yangjie01 <[email protected]>
AuthorDate: Fri Sep 27 08:51:47 2024 -0700

    [SPARK-49809][BUILD] Use `sbt.IO` in `SparkBuild.scala` to avoid naming 
conflicts with `java.io.IO` in Java 23
    
    ### What changes were proposed in this pull request?
    This PR changes `SparkBuild.scala` to use `sbt.IO` to avoid a naming 
conflict with `java.io.IO` in Java 23. After this PR, Spark can be built 
using sbt with Java 23 (this PR does not cover the results of `sbt/test` 
with Java 23).
    
    ### Why are the changes needed?
    Allow Spark to be compiled using sbt with Java 23.
    
    Java 23 adds `java.io.IO`, and `SparkBuild.scala` imports both 
`java.io._` and `sbt._`, which results in the following error when executing
    
    ```
    build/sbt -Phadoop-3 -Phive-thriftserver -Pspark-ganglia-lgpl 
-Pdocker-integration-tests -Pyarn -Pvolcano -Pkubernetes -Pkinesis-asl -Phive 
-Phadoop-cloud Test/package streaming-kinesis-asl-assembly/assembly 
connect/assembly
    ```
    
    with Java 23
    
    ```
    build/sbt -Phadoop-3 -Phive-thriftserver -Pspark-ganglia-lgpl 
-Pdocker-integration-tests -Pyarn -Pvolcano -Pkubernetes -Pkinesis-asl -Phive 
-Phadoop-cloud Test/package streaming-kinesis-asl-assembly/assembly 
connect/assembly
    Using /Users/yangjie01/Tools/zulu23 as default JAVA_HOME.
    Note, this will be overridden by -java-home if it is set.
    [info] welcome to sbt 1.9.3 (Azul Systems, Inc. Java 23)
    [info] loading settings for project global-plugins from idea.sbt ...
    [info] loading global plugins from /Users/yangjie01/.sbt/1.0/plugins
    [info] loading settings for project spark-sbt-build from plugins.sbt ...
    [info] loading project definition from 
/Users/yangjie01/SourceCode/git/spark-sbt/project
    [info] compiling 3 Scala sources to 
/Users/yangjie01/SourceCode/git/spark-sbt/project/target/scala-2.12/sbt-1.0/classes
 ...
    [error] 
/Users/yangjie01/SourceCode/git/spark-sbt/project/SparkBuild.scala:1209:7: 
reference to IO is ambiguous;
    [error] it is imported twice in the same scope by
    [error] import sbt._
    [error] and import java.io._
    [error]       IO.write(file, s"$hadoopProvidedProp = $isHadoopProvided")
    [error]       ^
    [error] one error found
    [error] (Compile / compileIncremental) Compilation failed
    ```
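
    The root cause is a general Scala rule: when two wildcard imports each bring a member with the same name into scope, an unqualified reference to that name is ambiguous and must be fully qualified. A minimal self-contained sketch (using hypothetical objects `pkgA`/`pkgB` in place of `sbt` and `java.io`, which are only illustrative) is:

    ```scala
    // Two containers that each expose an `IO` member, mimicking how both
    // `sbt._` and `java.io._` provide an `IO` under Java 23.
    object pkgA { object IO { def write(s: String): String = s"A:$s" } }
    object pkgB { object IO { def write(s: String): String = s"B:$s" } }

    object Demo {
      import pkgA._
      import pkgB._

      // IO.write("x")  // would not compile: "reference to IO is ambiguous"

      // Fully qualifying the reference resolves the ambiguity, which is
      // the same fix this PR applies with `sbt.IO.write(...)`.
      def write(s: String): String = pkgA.IO.write(s)
    }
    ```

    Fully qualifying one reference is less invasive than reordering or narrowing the imports, since `SparkBuild.scala` relies on many other members of both `sbt._` and `java.io._`.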
    
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    - Pass GitHub Actions
    - Manual check:
    
    ```
    build/sbt -Phadoop-3 -Phive-thriftserver -Pspark-ganglia-lgpl 
-Pdocker-integration-tests -Pyarn -Pvolcano -Pkubernetes -Pkinesis-asl -Phive 
-Phadoop-cloud Test/package streaming-kinesis-asl-assembly/assembly 
connect/assembly
    ```
    
    with Java 23. After this PR, the command above executes successfully.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    No
    
    Closes #48280 from LuciferYang/build-with-java23.
    
    Authored-by: yangjie01 <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 project/SparkBuild.scala | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/project/SparkBuild.scala b/project/SparkBuild.scala
index 82950fb30287..6137984a53c0 100644
--- a/project/SparkBuild.scala
+++ b/project/SparkBuild.scala
@@ -1206,7 +1206,7 @@ object YARN {
     genConfigProperties := {
       val file = (Compile / classDirectory).value / 
s"org/apache/spark/deploy/yarn/$propFileName"
       val isHadoopProvided = 
SbtPomKeys.effectivePom.value.getProperties.get(hadoopProvidedProp)
-      IO.write(file, s"$hadoopProvidedProp = $isHadoopProvided")
+      sbt.IO.write(file, s"$hadoopProvidedProp = $isHadoopProvided")
     },
     Compile / copyResources := (Def.taskDyn {
       val c = (Compile / copyResources).value


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
