danny0405 commented on code in PR #5279:
URL: https://github.com/apache/hudi/pull/5279#discussion_r846710007
##########
README.md:
##########
@@ -72,35 +72,46 @@ mvn clean javadoc:aggregate -Pjavadocs
### Build with different Spark versions
-The default Spark version supported is 2.4.4. To build for different Spark versions and Scala 2.12, use the
-corresponding profile
-
-| Label | Artifact Name for Spark Bundle | Maven Profile Option | Notes |
-|--|--|--|--|
-| Spark 2.4, Scala 2.11 | hudi-spark2.4-bundle_2.11 | `-Pspark2.4` | For Spark 2.4.4, which is the same as the default |
-| Spark 2.4, Scala 2.12 | hudi-spark2.4-bundle_2.12 | `-Pspark2.4,scala-2.12` | For Spark 2.4.4, which is the same as the default and Scala 2.12 |
-| Spark 3.1, Scala 2.12 | hudi-spark3.1-bundle_2.12 | `-Pspark3.1` | For Spark 3.1.x |
-| Spark 3.2, Scala 2.12 | hudi-spark3.2-bundle_2.12 | `-Pspark3.2` | For Spark 3.2.x |
-| Spark 3, Scala 2.12 | hudi-spark3-bundle_2.12 | `-Pspark3` | This is the same as `Spark 3.2, Scala 2.12` |
-| Spark, Scala 2.11 | hudi-spark-bundle_2.11 | Default | The default profile, supporting Spark 2.4.4 |
-| Spark, Scala 2.12 | hudi-spark-bundle_2.12 | `-Pscala-2.12` | The default profile (for Spark 2.4.4) with Scala 2.12 |
+The default Spark version supported is 2.4.4. Refer to the table below for building with different Spark and Scala versions.
+
+| Maven build options       | Expected Spark bundle jar name | Notes                                            |
+|:--------------------------|:-------------------------------|:-------------------------------------------------|
+| (empty)                   | hudi-spark-bundle_2.11         | For Spark 2.4.4 and Scala 2.11 (default options) |
+| `-Dspark2.4`              | hudi-spark2.4-bundle_2.11      | For Spark 2.4.4 and Scala 2.11 (same as default) |
+| `-Dspark2.4 -Dscala-2.12` | hudi-spark2.4-bundle_2.12      | For Spark 2.4.4 and Scala 2.12                   |
+| `-Dspark3.1 -Dscala-2.12` | hudi-spark3.1-bundle_2.12      | For Spark 3.1.x and Scala 2.12                   |
+| `-Dspark3.2 -Dscala-2.12` | hudi-spark3.2-bundle_2.12      | For Spark 3.2.x and Scala 2.12                   |
+| `-Dspark3`                | hudi-spark3-bundle_2.12        | For Spark 3.2.x and Scala 2.12                   |
+| `-Dscala-2.12`            | hudi-spark-bundle_2.12         | For Spark 2.4.4 and Scala 2.12                   |
For example,
```
-# Build against Spark 3.2.x (the default build shipped with the public Spark 3 bundle)
-mvn clean package -DskipTests -Pspark3.2
+# Build against Spark 3.2.x
+mvn clean package -DskipTests -Dspark3.2 # Same as -Dspark3, which is equivalent to the Spark 3.2 profile
# Build against Spark 3.1.x
-mvn clean package -DskipTests -Pspark3.1
+mvn clean package -DskipTests -Dspark3.1
# Build against Spark 2.4.4 and Scala 2.12
-mvn clean package -DskipTests -Pspark2.4,scala-2.12
+mvn clean package -DskipTests -Dspark2.4 -Dscala-2.12
```
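The switch from `-Pspark3.2` to `-Dspark3.2` in this diff suggests the profiles are now activated by a Maven property rather than selected explicitly by id. A minimal sketch of what such a property-activated profile could look like in a `pom.xml` (illustrative only; the profile id and version values are assumptions, not Hudi's actual build file):

```
<profile>
  <id>spark3.2</id>
  <activation>
    <!-- Turns on when the build is invoked with -Dspark3.2 -->
    <property>
      <name>spark3.2</name>
    </property>
  </activation>
  <properties>
    <!-- Hypothetical version properties for illustration -->
    <scala.binary.version>2.12</scala.binary.version>
  </properties>
</profile>
```

One practical difference: a `-P`-selected profile replaces the active-by-default profile set, while a `-D` property activation composes with other activation conditions, which can make combinations such as `-Dspark2.4 -Dscala-2.12` easier to support.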
-### What about "spark-avro" module?
+#### What about "spark-avro" module?
Starting from version 0.11, Hudi no longer requires `spark-avro` to be specified using `--packages`
+### Build with different Flink versions
+
+The default Flink version supported is 1.14. Refer to the table below for building with different Flink and Scala versions.
+
+| Maven build options        | Expected Flink bundle jar name | Notes                                           |
+|:---------------------------|:-------------------------------|:------------------------------------------------|
+| (empty)                    | hudi-flink1.14-bundle_2.11     | For Flink 1.14 and Scala 2.11 (default options) |
+| `-Dflink1.14`              | hudi-flink1.14-bundle_2.11     | For Flink 1.14 and Scala 2.11 (same as default) |
+| `-Dflink1.14 -Dscala-2.12` | hudi-flink1.14-bundle_2.12     | For Flink 1.14 and Scala 2.12                   |
+| `-Dflink1.13`              | hudi-flink1.13-bundle_2.11     | For Flink 1.13 and Scala 2.11                   |
+| `-Dflink1.13 -Dscala-2.12` | hudi-flink1.13-bundle_2.12     | For Flink 1.13 and Scala 2.12                   |
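+By analogy with the Spark example above, Flink bundle builds would presumably be invoked the same way; a sketch derived from the table (these exact commands are not part of the original diff):
+```
+# Build against Flink 1.14 and Scala 2.12
+mvn clean package -DskipTests -Dflink1.14 -Dscala-2.12
+
+# Build against Flink 1.13 and the default Scala 2.11
+mvn clean package -DskipTests -Dflink1.13
+```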
Review Comment:
Thanks for the build profile fix :)