This is an automated email from the ASF dual-hosted git repository.
xushiyan pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/hudi.git
The following commit(s) were added to refs/heads/asf-site by this push:
new 6645380ef2 [MINOR] Fixing 0.12.1 docs (#7054)
6645380ef2 is described below
commit 6645380ef21508fb6fc646aa42692fa9e3a53f6c
Author: Zhaojing Yu <[email protected]>
AuthorDate: Tue Oct 25 11:53:04 2022 +0800
[MINOR] Fixing 0.12.1 docs (#7054)
---
website/docs/cli.md | 2 +-
website/docs/flink-quick-start-guide.md | 6 ++---
website/docs/metadata_indexing.md | 8 +++----
website/docs/query_engine_setup.md | 2 +-
website/docs/quick-start-guide.md | 26 +++++++++++-----------
website/versioned_docs/version-0.12.1/cli.md | 2 +-
.../version-0.12.1/flink-quick-start-guide.md | 6 ++---
.../version-0.12.1/metadata_indexing.md | 8 +++----
.../version-0.12.1/query_engine_setup.md | 2 +-
.../version-0.12.1/quick-start-guide.md | 26 +++++++++++-----------
10 files changed, 44 insertions(+), 44 deletions(-)
diff --git a/website/docs/cli.md b/website/docs/cli.md
index 1d64d96e12..4731ecdc85 100644
--- a/website/docs/cli.md
+++ b/website/docs/cli.md
@@ -545,7 +545,7 @@ The following table shows the Hudi table versions corresponding to the Hudi rele
| Hudi Table Version | Hudi Release Version(s) |
|:-------------------|:------------------------|
-| `FIVE` or `5` | 0.12.0 and above |
+| `FIVE` or `5` | 0.12.x |
| `FOUR` or `4` | 0.11.x |
| `THREE` or `3` | 0.10.x |
| `TWO` or `2` | 0.9.x |
diff --git a/website/docs/flink-quick-start-guide.md b/website/docs/flink-quick-start-guide.md
index e8fb8829e9..5108f3f893 100644
--- a/website/docs/flink-quick-start-guide.md
+++ b/website/docs/flink-quick-start-guide.md
@@ -96,7 +96,7 @@ dependency to your project:
<dependency>
<groupId>org.apache.hudi</groupId>
<artifactId>hudi-flink1.13-bundle</artifactId>
- <version>0.12.0</version>
+ <version>0.12.1</version>
</dependency>
```
@@ -105,7 +105,7 @@ dependency to your project:
<dependency>
<groupId>org.apache.hudi</groupId>
<artifactId>hudi-flink1.14-bundle</artifactId>
- <version>0.12.0</version>
+ <version>0.12.1</version>
</dependency>
```
@@ -114,7 +114,7 @@ dependency to your project:
<dependency>
<groupId>org.apache.hudi</groupId>
<artifactId>hudi-flink1.15-bundle</artifactId>
- <version>0.12.0</version>
+ <version>0.12.1</version>
</dependency>
```
diff --git a/website/docs/metadata_indexing.md b/website/docs/metadata_indexing.md
index 73c091a09f..a505a0d082 100644
--- a/website/docs/metadata_indexing.md
+++ b/website/docs/metadata_indexing.md
@@ -45,7 +45,7 @@ hoodie.write.lock.zookeeper.base_path=<zk_base_path>
```bash
spark-submit \
---class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer `ls /Users/home/path/to/hudi-utilities-bundle/target/hudi-utilities-bundle_2.11-0.12.0-SNAPSHOT.jar` \
+--class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer `ls /Users/home/path/to/hudi-utilities-bundle/target/hudi-utilities-bundle_2.11-0.12.1-SNAPSHOT.jar` \
--props `ls /Users/home/path/to/write/config.properties` \
--source-class org.apache.hudi.utilities.sources.ParquetDFSSource --schemaprovider-class org.apache.hudi.utilities.schema.FilebasedSchemaProvider \
--source-ordering-field tpep_dropoff_datetime \
@@ -91,7 +91,7 @@ Now, we can schedule indexing using `HoodieIndexer` in `schedule` mode as follow
```
spark-submit \
--class org.apache.hudi.utilities.HoodieIndexer \
-/Users/home/path/to/hudi-utilities-bundle/target/hudi-utilities-bundle_2.11-0.12.0-SNAPSHOT.jar \
+/Users/home/path/to/hudi-utilities-bundle/target/hudi-utilities-bundle_2.11-0.12.1-SNAPSHOT.jar \
--props /Users/home/path/to/indexer.properties \
--mode schedule \
--base-path /tmp/hudi-ny-taxi \
@@ -110,7 +110,7 @@ To execute indexing, run the indexer in `execute` mode as below.
```
spark-submit \
--class org.apache.hudi.utilities.HoodieIndexer \
-/Users/home/path/to/hudi-utilities-bundle/target/hudi-utilities-bundle_2.11-0.12.0-SNAPSHOT.jar \
+/Users/home/path/to/hudi-utilities-bundle/target/hudi-utilities-bundle_2.11-0.12.1-SNAPSHOT.jar \
--props /Users/home/path/to/indexer.properties \
--mode execute \
--base-path /tmp/hudi-ny-taxi \
@@ -165,7 +165,7 @@ To drop an index, just run the index in `dropindex` mode.
```
spark-submit \
--class org.apache.hudi.utilities.HoodieIndexer \
-/Users/home/path/to/hudi-utilities-bundle/target/hudi-utilities-bundle_2.11-0.12.0-SNAPSHOT.jar \
+/Users/home/path/to/hudi-utilities-bundle/target/hudi-utilities-bundle_2.11-0.12.1-SNAPSHOT.jar \
--props /Users/home/path/to/indexer.properties \
--mode dropindex \
--base-path /tmp/hudi-ny-taxi \
diff --git a/website/docs/query_engine_setup.md b/website/docs/query_engine_setup.md
index 40cdabbd43..5df8c7623b 100644
--- a/website/docs/query_engine_setup.md
+++ b/website/docs/query_engine_setup.md
@@ -100,7 +100,7 @@ to `org.apache.hadoop.hive.ql.io.HiveInputFormat`. Then proceed to query the tab
## Redshift Spectrum
-Copy on Write Tables in Apache Hudi versions 0.5.2, 0.6.0, 0.7.0, 0.8.0, 0.9.0, 0.10.x, 0.11.x and 0.12.0 can be queried via Amazon Redshift Spectrum external tables.
+Copy on Write Tables in Apache Hudi versions 0.5.2, 0.6.0, 0.7.0, 0.8.0, 0.9.0, 0.10.x, 0.11.x and 0.12.x can be queried via Amazon Redshift Spectrum external tables.
To be able to query Hudi versions 0.10.0 and above please try latest versions of Redshift.
:::note
Hudi tables are supported only when AWS Glue Data Catalog is used. It's not supported when you use an Apache Hive metastore as the external catalog.
diff --git a/website/docs/quick-start-guide.md b/website/docs/quick-start-guide.md
index ed7bb29698..c610964f6c 100644
--- a/website/docs/quick-start-guide.md
+++ b/website/docs/quick-start-guide.md
@@ -50,7 +50,7 @@ From the extracted directory run spark-shell with Hudi:
```shell
# Spark 3.3
spark-shell \
- --packages org.apache.hudi:hudi-spark3.3-bundle_2.12:0.12.0 \
+ --packages org.apache.hudi:hudi-spark3.3-bundle_2.12:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
@@ -58,7 +58,7 @@ spark-shell \
```shell
# Spark 3.2
spark-shell \
- --packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.12.0 \
+ --packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
@@ -66,14 +66,14 @@ spark-shell \
```shell
# Spark 3.1
spark-shell \
- --packages org.apache.hudi:hudi-spark3.1-bundle_2.12:0.12.0 \
+ --packages org.apache.hudi:hudi-spark3.1-bundle_2.12:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
```
```shell
# Spark 2.4
spark-shell \
- --packages org.apache.hudi:hudi-spark2.4-bundle_2.11:0.12.0 \
+ --packages org.apache.hudi:hudi-spark2.4-bundle_2.11:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
```
@@ -87,7 +87,7 @@ From the extracted directory run pyspark with Hudi:
# Spark 3.3
export PYSPARK_PYTHON=$(which python3)
pyspark \
---packages org.apache.hudi:hudi-spark3.3-bundle_2.12:0.12.0 \
+--packages org.apache.hudi:hudi-spark3.3-bundle_2.12:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
@@ -96,7 +96,7 @@ pyspark \
# Spark 3.2
export PYSPARK_PYTHON=$(which python3)
pyspark \
---packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.12.0 \
+--packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
@@ -105,7 +105,7 @@ pyspark \
# Spark 3.1
export PYSPARK_PYTHON=$(which python3)
pyspark \
---packages org.apache.hudi:hudi-spark3.1-bundle_2.12:0.12.0 \
+--packages org.apache.hudi:hudi-spark3.1-bundle_2.12:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
```
@@ -113,7 +113,7 @@ pyspark \
# Spark 2.4
export PYSPARK_PYTHON=$(which python3)
pyspark \
---packages org.apache.hudi:hudi-spark2.4-bundle_2.11:0.12.0 \
+--packages org.apache.hudi:hudi-spark2.4-bundle_2.11:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
```
@@ -126,27 +126,27 @@ From the extracted directory run Spark SQL with Hudi:
```shell
# Spark 3.3
-spark-sql --packages org.apache.hudi:hudi-spark3.3-bundle_2.12:0.12.0 \
+spark-sql --packages org.apache.hudi:hudi-spark3.3-bundle_2.12:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension' \
--conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog'
```
```shell
# Spark 3.2
-spark-sql --packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.12.0 \
+spark-sql --packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension' \
--conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog'
```
```shell
# Spark 3.1
-spark-sql --packages org.apache.hudi:hudi-spark3.1-bundle_2.12:0.12.0 \
+spark-sql --packages org.apache.hudi:hudi-spark3.1-bundle_2.12:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
```
```shell
# Spark 2.4
-spark-sql --packages org.apache.hudi:hudi-spark2.4-bundle_2.11:0.12.0 \
+spark-sql --packages org.apache.hudi:hudi-spark2.4-bundle_2.11:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
```
@@ -1390,7 +1390,7 @@ more details please refer to [procedures](procedures).
You can also do the quickstart by [building hudi yourself](https://github.com/apache/hudi#building-apache-hudi-from-source), and using `--jars <path to hudi_code>/packaging/hudi-spark-bundle/target/hudi-spark3.2-bundle_2.1?-*.*.*-SNAPSHOT.jar` in the spark-shell command above
-instead of `--packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.12.0`. Hudi also supports scala 2.12. Refer [build with scala 2.12](https://github.com/apache/hudi#build-with-different-spark-versions)
+instead of `--packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.12.1`. Hudi also supports scala 2.12. Refer [build with scala 2.12](https://github.com/apache/hudi#build-with-different-spark-versions)
for more info.
Also, we used Spark here to show case the capabilities of Hudi. However, Hudi can support multiple table types/query types and
diff --git a/website/versioned_docs/version-0.12.1/cli.md b/website/versioned_docs/version-0.12.1/cli.md
index 9d1e98198b..071d95e993 100644
--- a/website/versioned_docs/version-0.12.1/cli.md
+++ b/website/versioned_docs/version-0.12.1/cli.md
@@ -501,7 +501,7 @@ The following table shows the Hudi table versions corresponding to the Hudi rele
| Hudi Table Version | Hudi Release Version(s) |
|:-------------------|:------------------------|
-| `FIVE` or `5` | 0.12.0 and above |
+| `FIVE` or `5` | 0.12.x |
| `FOUR` or `4` | 0.11.x |
| `THREE` or `3` | 0.10.x |
| `TWO` or `2` | 0.9.x |
diff --git a/website/versioned_docs/version-0.12.1/flink-quick-start-guide.md b/website/versioned_docs/version-0.12.1/flink-quick-start-guide.md
index 2f33027cf6..c2b965f737 100644
--- a/website/versioned_docs/version-0.12.1/flink-quick-start-guide.md
+++ b/website/versioned_docs/version-0.12.1/flink-quick-start-guide.md
@@ -96,7 +96,7 @@ dependency to your project:
<dependency>
<groupId>org.apache.hudi</groupId>
<artifactId>hudi-flink1.13-bundle</artifactId>
- <version>0.12.0</version>
+ <version>0.12.1</version>
</dependency>
```
@@ -105,7 +105,7 @@ dependency to your project:
<dependency>
<groupId>org.apache.hudi</groupId>
<artifactId>hudi-flink1.14-bundle</artifactId>
- <version>0.12.0</version>
+ <version>0.12.1</version>
</dependency>
```
@@ -114,7 +114,7 @@ dependency to your project:
<dependency>
<groupId>org.apache.hudi</groupId>
<artifactId>hudi-flink1.15-bundle</artifactId>
- <version>0.12.0</version>
+ <version>0.12.1</version>
</dependency>
```
diff --git a/website/versioned_docs/version-0.12.1/metadata_indexing.md b/website/versioned_docs/version-0.12.1/metadata_indexing.md
index 73c091a09f..a505a0d082 100644
--- a/website/versioned_docs/version-0.12.1/metadata_indexing.md
+++ b/website/versioned_docs/version-0.12.1/metadata_indexing.md
@@ -45,7 +45,7 @@ hoodie.write.lock.zookeeper.base_path=<zk_base_path>
```bash
spark-submit \
---class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer `ls /Users/home/path/to/hudi-utilities-bundle/target/hudi-utilities-bundle_2.11-0.12.0-SNAPSHOT.jar` \
+--class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer `ls /Users/home/path/to/hudi-utilities-bundle/target/hudi-utilities-bundle_2.11-0.12.1-SNAPSHOT.jar` \
--props `ls /Users/home/path/to/write/config.properties` \
--source-class org.apache.hudi.utilities.sources.ParquetDFSSource --schemaprovider-class org.apache.hudi.utilities.schema.FilebasedSchemaProvider \
--source-ordering-field tpep_dropoff_datetime \
@@ -91,7 +91,7 @@ Now, we can schedule indexing using `HoodieIndexer` in `schedule` mode as follow
```
spark-submit \
--class org.apache.hudi.utilities.HoodieIndexer \
-/Users/home/path/to/hudi-utilities-bundle/target/hudi-utilities-bundle_2.11-0.12.0-SNAPSHOT.jar \
+/Users/home/path/to/hudi-utilities-bundle/target/hudi-utilities-bundle_2.11-0.12.1-SNAPSHOT.jar \
--props /Users/home/path/to/indexer.properties \
--mode schedule \
--base-path /tmp/hudi-ny-taxi \
@@ -110,7 +110,7 @@ To execute indexing, run the indexer in `execute` mode as below.
```
spark-submit \
--class org.apache.hudi.utilities.HoodieIndexer \
-/Users/home/path/to/hudi-utilities-bundle/target/hudi-utilities-bundle_2.11-0.12.0-SNAPSHOT.jar \
+/Users/home/path/to/hudi-utilities-bundle/target/hudi-utilities-bundle_2.11-0.12.1-SNAPSHOT.jar \
--props /Users/home/path/to/indexer.properties \
--mode execute \
--base-path /tmp/hudi-ny-taxi \
@@ -165,7 +165,7 @@ To drop an index, just run the index in `dropindex` mode.
```
spark-submit \
--class org.apache.hudi.utilities.HoodieIndexer \
-/Users/home/path/to/hudi-utilities-bundle/target/hudi-utilities-bundle_2.11-0.12.0-SNAPSHOT.jar \
+/Users/home/path/to/hudi-utilities-bundle/target/hudi-utilities-bundle_2.11-0.12.1-SNAPSHOT.jar \
--props /Users/home/path/to/indexer.properties \
--mode dropindex \
--base-path /tmp/hudi-ny-taxi \
diff --git a/website/versioned_docs/version-0.12.1/query_engine_setup.md b/website/versioned_docs/version-0.12.1/query_engine_setup.md
index a581337d0d..91cd6c8879 100644
--- a/website/versioned_docs/version-0.12.1/query_engine_setup.md
+++ b/website/versioned_docs/version-0.12.1/query_engine_setup.md
@@ -92,7 +92,7 @@ to `org.apache.hadoop.hive.ql.io.HiveInputFormat`. Then proceed to query the tab
## Redshift Spectrum
-Copy on Write Tables in Apache Hudi versions 0.5.2, 0.6.0, 0.7.0, 0.8.0, 0.9.0, 0.10.x, 0.11.x and 0.12.0 can be queried via Amazon Redshift Spectrum external tables.
+Copy on Write Tables in Apache Hudi versions 0.5.2, 0.6.0, 0.7.0, 0.8.0, 0.9.0, 0.10.x, 0.11.x and 0.12.x can be queried via Amazon Redshift Spectrum external tables.
To be able to query Hudi versions 0.10.0 and above please try latest versions of Redshift.
:::note
Hudi tables are supported only when AWS Glue Data Catalog is used. It's not supported when you use an Apache Hive metastore as the external catalog.
diff --git a/website/versioned_docs/version-0.12.1/quick-start-guide.md b/website/versioned_docs/version-0.12.1/quick-start-guide.md
index 2610a88f30..24df38e9aa 100644
--- a/website/versioned_docs/version-0.12.1/quick-start-guide.md
+++ b/website/versioned_docs/version-0.12.1/quick-start-guide.md
@@ -50,7 +50,7 @@ From the extracted directory run spark-shell with Hudi:
```shell
# Spark 3.3
spark-shell \
- --packages org.apache.hudi:hudi-spark3.3-bundle_2.12:0.12.0 \
+ --packages org.apache.hudi:hudi-spark3.3-bundle_2.12:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
@@ -58,7 +58,7 @@ spark-shell \
```shell
# Spark 3.2
spark-shell \
- --packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.12.0 \
+ --packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
@@ -66,14 +66,14 @@ spark-shell \
```shell
# Spark 3.1
spark-shell \
- --packages org.apache.hudi:hudi-spark3.1-bundle_2.12:0.12.0 \
+ --packages org.apache.hudi:hudi-spark3.1-bundle_2.12:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
```
```shell
# Spark 2.4
spark-shell \
- --packages org.apache.hudi:hudi-spark2.4-bundle_2.11:0.12.0 \
+ --packages org.apache.hudi:hudi-spark2.4-bundle_2.11:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
```
@@ -87,7 +87,7 @@ From the extracted directory run pyspark with Hudi:
# Spark 3.3
export PYSPARK_PYTHON=$(which python3)
pyspark \
---packages org.apache.hudi:hudi-spark3.3-bundle_2.12:0.12.0 \
+--packages org.apache.hudi:hudi-spark3.3-bundle_2.12:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
@@ -96,7 +96,7 @@ pyspark \
# Spark 3.2
export PYSPARK_PYTHON=$(which python3)
pyspark \
---packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.12.0 \
+--packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
@@ -105,7 +105,7 @@ pyspark \
# Spark 3.1
export PYSPARK_PYTHON=$(which python3)
pyspark \
---packages org.apache.hudi:hudi-spark3.1-bundle_2.12:0.12.0 \
+--packages org.apache.hudi:hudi-spark3.1-bundle_2.12:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
```
@@ -113,7 +113,7 @@ pyspark \
# Spark 2.4
export PYSPARK_PYTHON=$(which python3)
pyspark \
---packages org.apache.hudi:hudi-spark2.4-bundle_2.11:0.12.0 \
+--packages org.apache.hudi:hudi-spark2.4-bundle_2.11:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
```
@@ -126,27 +126,27 @@ From the extracted directory run Spark SQL with Hudi:
```shell
# Spark 3.3
-spark-sql --packages org.apache.hudi:hudi-spark3.3-bundle_2.12:0.12.0 \
+spark-sql --packages org.apache.hudi:hudi-spark3.3-bundle_2.12:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension' \
--conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog'
```
```shell
# Spark 3.2
-spark-sql --packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.12.0 \
+spark-sql --packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension' \
--conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog'
```
```shell
# Spark 3.1
-spark-sql --packages org.apache.hudi:hudi-spark3.1-bundle_2.12:0.12.0 \
+spark-sql --packages org.apache.hudi:hudi-spark3.1-bundle_2.12:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
```
```shell
# Spark 2.4
-spark-sql --packages org.apache.hudi:hudi-spark2.4-bundle_2.11:0.12.0 \
+spark-sql --packages org.apache.hudi:hudi-spark2.4-bundle_2.11:0.12.1 \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
```
@@ -1430,7 +1430,7 @@ more details please refer to [procedures](procedures).
You can also do the quickstart by [building hudi yourself](https://github.com/apache/hudi#building-apache-hudi-from-source), and using `--jars <path to hudi_code>/packaging/hudi-spark-bundle/target/hudi-spark3.2-bundle_2.1?-*.*.*-SNAPSHOT.jar` in the spark-shell command above
-instead of `--packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.12.0`. Hudi also supports scala 2.12. Refer [build with scala 2.12](https://github.com/apache/hudi#build-with-different-spark-versions)
+instead of `--packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.12.1`. Hudi also supports scala 2.12. Refer [build with scala 2.12](https://github.com/apache/hudi#build-with-different-spark-versions)
for more info.
Also, we used Spark here to show case the capabilities of Hudi. However, Hudi can support multiple table types/query types and