Copilot commented on code in PR #2018:
URL: https://github.com/apache/auron/pull/2018#discussion_r2867287331


##########
dev/reformat:
##########
@@ -52,12 +52,20 @@ fi
 sparkver=spark-3.5
 for celebornver in celeborn-0.5 celeborn-0.6
 do
-  run_maven -P"${sparkver}" -Pceleborn,"${celebornver}" -Puniffle,uniffle-0.10 -Ppaimon,paimon-1.2 -Pflink,flink-1.18 -Piceberg,iceberg-1.9
+  run_maven -P"${sparkver}" -Pceleborn,"${celebornver}" -Puniffle,uniffle-0.10 -Ppaimon,paimon-1.2 -Pflink-1.18 -Piceberg-1.9
 
 done
 
-sparkvers=(spark-3.0 spark-3.1 spark-3.2 spark-3.3 spark-3.4)
+sparkvers=(spark-3.0 spark-3.1 spark-3.2 spark-3.3 spark-3.4 spark-4.0 spark-4.1)
 for sparkver in "${sparkvers[@]}"
 do
+  if [[ $sparkver == spark-4.* ]]; then
+    SCALA_PROFILE=scala-2.13
+    # Use JDK 21 for Spark 4.x to match the TPC-DS CI configuration
+    export JAVA_HOME=$(/usr/libexec/java_home -v 21)
+  else

Review Comment:
   The comment says Spark 4.x should use JDK 21, but the PR description (and 
Maven enforcer rules for Spark 4.0/4.1) require JDK 17+. Using 21 may be fine, 
but it’s inconsistent with the stated requirement and with the style workflow 
(which installs 17). Consider switching this to JDK 17 for Spark 4.x, or 
updating CI/workflow and docs to consistently use 21.



##########
dev/reformat:
##########
@@ -52,12 +52,20 @@ fi
 sparkver=spark-3.5
 for celebornver in celeborn-0.5 celeborn-0.6
 do
-  run_maven -P"${sparkver}" -Pceleborn,"${celebornver}" -Puniffle,uniffle-0.10 -Ppaimon,paimon-1.2 -Pflink,flink-1.18 -Piceberg,iceberg-1.9
+  run_maven -P"${sparkver}" -Pceleborn,"${celebornver}" -Puniffle,uniffle-0.10 -Ppaimon,paimon-1.2 -Pflink-1.18 -Piceberg-1.9
 
 done
 
-sparkvers=(spark-3.0 spark-3.1 spark-3.2 spark-3.3 spark-3.4)
+sparkvers=(spark-3.0 spark-3.1 spark-3.2 spark-3.3 spark-3.4 spark-4.0 spark-4.1)
 for sparkver in "${sparkvers[@]}"
 do
+  if [[ $sparkver == spark-4.* ]]; then
+    SCALA_PROFILE=scala-2.13
+    # Use JDK 21 for Spark 4.x to match the TPC-DS CI configuration
+    export JAVA_HOME=$(/usr/libexec/java_home -v 21)
+  else
+    SCALA_PROFILE=scala-2.12
+    export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)
+  fi

Review Comment:
   `/usr/libexec/java_home` is macOS-specific. Since this script is invoked by 
CI on Ubuntu, these lines will fail (command not found) under `set -e`, 
breaking `./dev/reformat --check`. Consider making JAVA_HOME selection 
OS-agnostic (e.g., detect `java_home` availability and otherwise use paths 
provided by CI such as setup-java outputs / exported env vars), and update PATH 
if needed when switching JDKs.
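   One way to make the selection OS-agnostic is a small helper that prefers a CI-provided home and only falls back to the macOS resolver. This is a sketch, not the PR's code: the helper name `select_jdk_home` is hypothetical, and the `JAVA_HOME_<version>_X64` variables are an assumption based on what `actions/setup-java` exports on GitHub-hosted runners.

```shell
# Sketch: resolve a JDK home portably. Prefer the env var exported by
# actions/setup-java (e.g. JAVA_HOME_17_X64); fall back to the macOS
# /usr/libexec/java_home helper; otherwise fail with a clear message.
select_jdk_home() {
  version="$1"
  var="JAVA_HOME_${version}_X64"
  eval "resolved=\${$var:-}"          # indirect lookup, POSIX-sh compatible
  if [ -n "$resolved" ]; then
    printf '%s\n' "$resolved"
  elif [ -x /usr/libexec/java_home ]; then
    /usr/libexec/java_home -v "$version"
  else
    echo "select_jdk_home: no JDK $version available" >&2
    return 1
  fi
}
```

   The script would then run `export JAVA_HOME=$(select_jdk_home 17)` (and prepend `$JAVA_HOME/bin` to `PATH`) instead of calling `/usr/libexec/java_home` directly, so the same code path works on both macOS and Ubuntu runners.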



##########
.github/workflows/style.yml:
##########
@@ -46,5 +46,12 @@ jobs:
           java-version: 8
           cache: 'maven'
           check-latest: false
+      - name: Setup JDK 17
+        uses: actions/setup-java@v5
+        with:
+          distribution: 'adopt-hotspot'
+          java-version: 17
+          cache: 'maven'
+          check-latest: false

Review Comment:
   This workflow installs JDK 8 and then JDK 17, but `dev/reformat` attempts to 
switch JAVA_HOME internally. As written, the script’s switching logic is 
macOS-specific and (even after fixing that) it will need reliable JDK home 
paths on Ubuntu (e.g., via setup-java step outputs or exported env vars) to 
toggle between versions. Consider wiring the JDK home paths into the 
environment (or splitting the style job by Spark/JDK matrix) so the reformat 
script can deterministically select the right JDK.
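   On the script side, the toggle could consume those wired-in paths explicitly and fail fast when they are missing. A minimal sketch, assuming the workflow (or `actions/setup-java` on hosted runners) exports `JAVA_HOME_8_X64` and `JAVA_HOME_17_X64`; the function name `pick_jdk_for_spark` is hypothetical:

```shell
# Sketch: deterministic JDK/Scala selection per Spark profile, driven by
# env vars provided by CI rather than macOS-only tooling. The :? expansion
# aborts with a message if the expected JDK home was not wired in.
pick_jdk_for_spark() {
  sparkver="$1"
  case "$sparkver" in
    spark-4.*)
      SCALA_PROFILE=scala-2.13
      JAVA_HOME="${JAVA_HOME_17_X64:?JDK 17 home not provided by CI}"
      ;;
    *)
      SCALA_PROFILE=scala-2.12
      JAVA_HOME="${JAVA_HOME_8_X64:?JDK 8 home not provided by CI}"
      ;;
  esac
  export JAVA_HOME
  export PATH="$JAVA_HOME/bin:$PATH"   # make the selected JDK win on PATH
}
```

   Failing on a missing variable is deliberate: a silent fallback to whatever `java` is on `PATH` would reformat with the wrong JDK and produce confusing diffs in `--check` mode.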



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
