yihua commented on code in PR #12772:
URL: https://github.com/apache/hudi/pull/12772#discussion_r2074091086
##########
.github/workflows/bot.yml:
##########
@@ -964,33 +970,50 @@ jobs:
include:
- sparkProfile: 'spark3.5'
sparkArchive: 'spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz'
+ scalaProfile: '-Dscala-2.12 -Dscala.binary.version=2.12'
Review Comment:
Let's add a new job for integration tests on Java 17, without affecting the
existing Spark 3.5 integration tests on Java 8?
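
One possible shape for such a job (a sketch only; the job name is hypothetical, and the matrix keys and archive path are copied from the existing Spark 3.5 entry in this hunk, not from the PR):

```yaml
  # Hypothetical new job: same matrix as the Java 8 integration tests,
  # but sets up JDK 17 instead, leaving the existing job untouched.
  integration-tests-java17:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        include:
          - sparkProfile: 'spark3.5'
            sparkArchive: 'spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz'
            scalaProfile: '-Dscala-2.12 -Dscala.binary.version=2.12'
    steps:
      - uses: actions/checkout@v3
      - name: Set up JDK 17
        uses: actions/setup-java@v3
        with:
          java-version: '17'
          distribution: 'temurin'
          architecture: x64
```

Keeping it as a separate job (rather than widening the existing matrix) means a Java 17 failure cannot block the established Java 8 path.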
##########
.github/workflows/bot.yml:
##########
@@ -361,21 +361,24 @@ jobs:
- scalaProfile: "scala-2.12"
sparkProfile: "spark3.4"
sparkModules: "hudi-spark-datasource/hudi-spark3.4.x"
+ - scalaProfile: "scala-2.13"
Review Comment:
Java tests in GH CI are split into two jobs:
`test-spark-java17-java-unit-tests` and
`test-spark-java17-java-functional-tests`. The `spark4` profile should also
be added to `test-spark-java17-java-unit-tests`.
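
The requested addition would mirror the matrix entry shown in this hunk (a sketch only; the surrounding job structure is assumed, not quoted from the PR):

```yaml
  # In test-spark-java17-java-unit-tests, add the same include as above:
        include:
          - scalaProfile: "scala-2.13"
            sparkProfile: "spark4"
            sparkModules: "hudi-spark-datasource/hudi-spark4.0.x"
```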
##########
.github/workflows/bot.yml:
##########
@@ -964,33 +970,50 @@ jobs:
include:
- sparkProfile: 'spark3.5'
sparkArchive: 'spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz'
+ scalaProfile: '-Dscala-2.12 -Dscala.binary.version=2.12'
+
+# TODO: Integration tests on Spark4 do not work on existing hadoop/hive/spark docker images from apachehudi dockerhub repo:
+#
+# java.lang.NoSuchFieldError: NVDIMM
+#   at org.apache.hudi.integ.testsuite.TestDFSHoodieTestSuiteWriterAdapter.initClass(TestDFSHoodieTestSuiteWriterAdapter.java:67)
+# [ERROR] Errors:
+# [ERROR]   TestDFSHoodieTestSuiteWriterAdapter.initClass:67->UtilitiesTestBase.initTestServices:152 » NoSuchField
+# [ERROR]   TestFileDeltaInputWriter.initClass:62->UtilitiesTestBase.initTestServices:152 » NoSuchField
+# [ERROR]   TestDFSAvroDeltaInputReader.initClass:47->UtilitiesTestBase.initTestServices:152 » NoSuchField
+# [ERROR]   TestDFSHoodieDatasetInputReader.initClass:56->UtilitiesTestBase.initTestServices:152 » NoSuchField
+# [ERROR] Tests run: 21, Failures: 0, Errors: 4, Skipped: 2
+
+# - sparkProfile: 'spark4'
+#   sparkArchive: 'spark-4.0.0-preview1/spark-4.0.0-preview1-bin-hadoop3.tgz'
+#   scalaProfile: '-Dscala-2.13 -Dscala.binary.version=2.13'
Review Comment:
Let's put this into a JIRA ticket for tracking instead of adding comments
here.
##########
.github/workflows/bot.yml:
##########
@@ -964,33 +970,50 @@ jobs:
include:
- sparkProfile: 'spark3.5'
sparkArchive: 'spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz'
+ scalaProfile: '-Dscala-2.12 -Dscala.binary.version=2.12'
+
+# TODO: Integration tests on Spark4 do not work on existing hadoop/hive/spark docker images from apachehudi dockerhub repo:
+#
+# java.lang.NoSuchFieldError: NVDIMM
+#   at org.apache.hudi.integ.testsuite.TestDFSHoodieTestSuiteWriterAdapter.initClass(TestDFSHoodieTestSuiteWriterAdapter.java:67)
+# [ERROR] Errors:
+# [ERROR]   TestDFSHoodieTestSuiteWriterAdapter.initClass:67->UtilitiesTestBase.initTestServices:152 » NoSuchField
+# [ERROR]   TestFileDeltaInputWriter.initClass:62->UtilitiesTestBase.initTestServices:152 » NoSuchField
+# [ERROR]   TestDFSAvroDeltaInputReader.initClass:47->UtilitiesTestBase.initTestServices:152 » NoSuchField
+# [ERROR]   TestDFSHoodieDatasetInputReader.initClass:56->UtilitiesTestBase.initTestServices:152 » NoSuchField
+# [ERROR] Tests run: 21, Failures: 0, Errors: 4, Skipped: 2
+
+# - sparkProfile: 'spark4'
+#   sparkArchive: 'spark-4.0.0-preview1/spark-4.0.0-preview1-bin-hadoop3.tgz'
+#   scalaProfile: '-Dscala-2.13 -Dscala.binary.version=2.13'
Review Comment:
Does this mean that Spark 4 + Scala 2.13 is not supported yet?
##########
.github/workflows/bot.yml:
##########
@@ -422,21 +425,24 @@ jobs:
- scalaProfile: "scala-2.12"
sparkProfile: "spark3.4"
sparkModules: "hudi-spark-datasource/hudi-spark3.4.x"
+ - scalaProfile: "scala-2.13"
+ sparkProfile: "spark4"
+ sparkModules: "hudi-spark-datasource/hudi-spark4.0.x"
steps:
- uses: actions/checkout@v3
- - name: Set up JDK 8
+ - name: Set up JDK 17
uses: actions/setup-java@v3
with:
- java-version: '8'
+ java-version: '17'
distribution: 'temurin'
architecture: x64
- name: Build Project
env:
SCALA_PROFILE: ${{ matrix.scalaProfile }}
SPARK_PROFILE: ${{ matrix.sparkProfile }}
run:
- mvn clean install -T 2 -D"$SCALA_PROFILE" -D"$SPARK_PROFILE" -DskipTests=true $MVN_ARGS -am -pl "hudi-examples/hudi-examples-spark,hudi-common,$SPARK_COMMON_MODULES,$SPARK_MODULES"
+ mvn clean install -T 2 -Pjava17 -D"$SCALA_PROFILE" -D"$SPARK_PROFILE" -DskipTests=true $MVN_ARGS -am -pl "hudi-examples/hudi-examples-spark,hudi-common,$SPARK_COMMON_MODULES,$SPARK_MODULES"
Review Comment:
Let's add tests on Spark 4 in `test-spark-java17-scala-other-tests` as well.
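
That would amount to the same matrix entry this hunk adds, placed in the other job (a sketch only; the enclosing job definition is assumed, not quoted from the PR):

```yaml
  # In test-spark-java17-scala-other-tests, mirror the new include
  # so Spark 4 is covered by the Scala tests too:
        include:
          - scalaProfile: "scala-2.13"
            sparkProfile: "spark4"
            sparkModules: "hudi-spark-datasource/hudi-spark4.0.x"
```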
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]