This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 6960a50a0fd [SPARK-45277][BUILD][INFRA] Install Java 17 to support SparkR testing on Windows
6960a50a0fd is described below

commit 6960a50a0fdcef91e3a88fee848bbf13a8dc62ca
Author: yangjie01 <[email protected]>
AuthorDate: Fri Sep 22 10:10:04 2023 -0700

    [SPARK-45277][BUILD][INFRA] Install Java 17 to support SparkR testing on Windows
    
    ### What changes were proposed in this pull request?
    This PR adds a step to `appveyor-install-dependencies.ps1` that installs Java 17, so that SparkR can use Java 17 for testing on Windows.
    
    It also corrects the `log4j2` configuration file name used for testing in `appveyor.yml`.
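    
    For reference, the `log4j2.properties` naming reflects Log4j 2's properties-format configuration. A minimal sketch along these lines (standard Log4j 2 property syntax; the actual contents of Spark's `R/log4j2.properties` may differ) is:
    
    ```properties
    # Illustrative minimal Log4j 2 properties-format config (a sketch,
    # not the actual file shipped in the Spark repository)
    rootLogger.level = info
    rootLogger.appenderRef.file.ref = File
    
    appender.file.type = File
    appender.file.name = File
    appender.file.fileName = target/unit-tests.log
    appender.file.layout.type = PatternLayout
    appender.file.layout.pattern = %d{yy/MM/dd HH:mm:ss.SSS} %p %c: %m%n
    ```
    
    Unlike Log4j 1.x's `log4j.properties`, Log4j 2 will not pick up the old file name, which is why the `--driver-java-options` reference in `appveyor.yml` needed the rename.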
    
    ### Why are the changes needed?
    Apache Spark now requires Java 17 as its minimum version, so SparkR also needs Java 17 for testing on Windows. However, the Windows image currently in use does not come with Java 17 pre-installed, so this PR installs it manually.
    
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    `continuous-integration/appveyor/pr` should pass
    
    ### Was this patch authored or co-authored using generative AI tooling?
    No
    
    Closes #43056 from LuciferYang/SPARK-45277.
    
    Authored-by: yangjie01 <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 appveyor.yml                          |  2 +-
 dev/appveyor-install-dependencies.ps1 | 13 ++++++++++++-
 2 files changed, 13 insertions(+), 2 deletions(-)

diff --git a/appveyor.yml b/appveyor.yml
index fdb247d5d43..762e4cf55b9 100644
--- a/appveyor.yml
+++ b/appveyor.yml
@@ -66,7 +66,7 @@ environment:
   R_REMOTES_NO_ERRORS_FROM_WARNINGS: true
 
 test_script:
-  - cmd: .\bin\spark-submit2.cmd --driver-java-options "-Dlog4j.configuration=file:///%CD:\=/%/R/log4j.properties" --conf spark.hadoop.fs.defaultFS="file:///" R\pkg\tests\run-all.R
+  - cmd: .\bin\spark-submit2.cmd --driver-java-options "-Dlog4j.configuration=file:///%CD:\=/%/R/log4j2.properties" --conf spark.hadoop.fs.defaultFS="file:///" R\pkg\tests\run-all.R
 
 notifications:
   - provider: Email
diff --git a/dev/appveyor-install-dependencies.ps1 b/dev/appveyor-install-dependencies.ps1
index 682d388bdf9..c07405e01e9 100644
--- a/dev/appveyor-install-dependencies.ps1
+++ b/dev/appveyor-install-dependencies.ps1
@@ -94,9 +94,20 @@ if (!(Test-Path $tools)) {
 #
 # Pop-Location
 
-# ========================== SBT
 Push-Location $tools
 
+# ========================== Java 17
+$zuluFileName="zulu17.44.53-ca-jdk17.0.8.1-win_x64"
+Start-FileDownload "https://cdn.azul.com/zulu/bin/$zuluFileName.zip"; "zulu.zip"
+
+# extract
+Invoke-Expression "7z.exe x zulu.zip"
+
+#add java 17 to environment variables
+$env:JAVA_HOME = "$tools\$zuluFileName"
+$env:PATH = "$JAVA_HOME\bin;" + $env:PATH
+
+# ========================== SBT
 $sbtVer = "1.9.3"
 Start-FileDownload "https://github.com/sbt/sbt/releases/download/v$sbtVer/sbt-$sbtVer.zip"; "sbt.zip"
 

