This is an automated email from the ASF dual-hosted git repository.

yihua pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/hudi.git


The following commit(s) were added to refs/heads/master by this push:
     new b476f9aeaa0 [HUDI-6341] Update README with Spark 3.4 support (#8957)
b476f9aeaa0 is described below

commit b476f9aeaa0f1c0df62619ccee1ba829596d90d8
Author: Y Ethan Guo <[email protected]>
AuthorDate: Tue Jun 13 21:12:49 2023 -0700

    [HUDI-6341] Update README with Spark 3.4 support (#8957)
---
 README.md | 14 ++++++++++----
 1 file changed, 10 insertions(+), 4 deletions(-)

diff --git a/README.md b/README.md
index 4526b9517d8..ff2b95ec547 100644
--- a/README.md
+++ b/README.md
@@ -85,7 +85,7 @@ mvn clean javadoc:aggregate -Pjavadocs
 ### Build with different Spark versions
 
 The default Spark 2.x version supported is 2.4.4. The default Spark 3.x version, corresponding to `spark3` profile is
-3.3.1. The default Scala version is 2.12. Refer to the table below for building with different Spark and Scala versions.
+3.4.0. The default Scala version is 2.12. Refer to the table below for building with different Spark and Scala versions.
 
 | Maven build options       | Expected Spark bundle jar name               | Notes                                            |
 |:--------------------------|:---------------------------------------------|:-------------------------------------------------|
@@ -95,17 +95,18 @@ The default Spark 2.x version supported is 2.4.4. The default Spark 3.x version,
 | `-Dspark3.1`              | hudi-spark3.1-bundle_2.12                    | For Spark 3.1.x and Scala 2.12                   |
 | `-Dspark3.2`              | hudi-spark3.2-bundle_2.12                    | For Spark 3.2.x and Scala 2.12 (same as default) |
 | `-Dspark3.3`              | hudi-spark3.3-bundle_2.12                    | For Spark 3.3.x and Scala 2.12                   |
+| `-Dspark3.4`              | hudi-spark3.4-bundle_2.12                    | For Spark 3.4.x and Scala 2.12                   |
 | `-Dspark2 -Dscala-2.11`   | hudi-spark-bundle_2.11 (legacy bundle name)  | For Spark 2.4.4 and Scala 2.11                   |
 | `-Dspark2 -Dscala-2.12`   | hudi-spark-bundle_2.12 (legacy bundle name)  | For Spark 2.4.4 and Scala 2.12                   |
-| `-Dspark3`                | hudi-spark3-bundle_2.12 (legacy bundle name) | For Spark 3.3.x and Scala 2.12                   |
+| `-Dspark3`                | hudi-spark3-bundle_2.12 (legacy bundle name) | For Spark 3.4.x and Scala 2.12                   |
 
 For example,
 ```
 # Build against Spark 3.2.x
 mvn clean package -DskipTests
 
-# Build against Spark 3.1.x
-mvn clean package -DskipTests -Dspark3.1
+# Build against Spark 3.4.x
+mvn clean package -DskipTests -Dspark3.4
 
 # Build against Spark 2.4.4 and Scala 2.11
 mvn clean package -DskipTests -Dspark2.4 -Dscala-2.11
@@ -156,6 +157,11 @@ Functional tests, which are tagged with `@Tag("functional")`, can be run with ma
 mvn -Pfunctional-tests test
 ```
 
+Integration tests can be run with maven profile `integration-tests`.
+```
+mvn -Pintegration-tests verify
+```
+
 To run tests with spark event logging enabled, define the Spark event log directory. This allows visualizing test DAG and stages using Spark History Server UI.
 ```
 mvn -Punit-tests test -DSPARK_EVLOG_DIR=/path/for/spark/event/log

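The table in the diff above maps each Maven build option to an expected Spark bundle jar name. As a quick illustration of that mapping (a sketch, not part of the commit; `bundle_name` is a hypothetical helper name introduced here only for this example):

```shell
# Sketch, not part of the commit: encode the README table's build-option to
# bundle-jar mapping as a shell lookup. "bundle_name" is hypothetical.
bundle_name() {
  case "$1" in
    "-Dspark3.1")            echo "hudi-spark3.1-bundle_2.12" ;;
    "-Dspark3.2")            echo "hudi-spark3.2-bundle_2.12" ;;
    "-Dspark3.3")            echo "hudi-spark3.3-bundle_2.12" ;;
    "-Dspark3.4")            echo "hudi-spark3.4-bundle_2.12" ;;  # row added by this commit
    "-Dspark2 -Dscala-2.11") echo "hudi-spark-bundle_2.11" ;;     # legacy bundle name
    "-Dspark2 -Dscala-2.12") echo "hudi-spark-bundle_2.12" ;;     # legacy bundle name
    "-Dspark3")              echo "hudi-spark3-bundle_2.12" ;;    # legacy, now Spark 3.4.x
    *) echo "unknown build option: $1" >&2; return 1 ;;
  esac
}

bundle_name "-Dspark3.4"   # prints hudi-spark3.4-bundle_2.12
```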