This is an automated email from the ASF dual-hosted git repository.
lzljs3620320 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-table-store.git
The following commit(s) were added to refs/heads/master by this push:
new 0ffa6654 Fix spark jar name in docs for table store
0ffa6654 is described below
commit 0ffa6654b2d64fc65c430e453e656fa68ce74632
Author: Kerwin <[email protected]>
AuthorDate: Fri Mar 10 12:37:38 2023 +0800
Fix spark jar name in docs for table store
This closes #588
---
docs/content/docs/engines/spark2.md | 8 ++++----
docs/content/docs/engines/spark3.md | 16 +++++++++++-----
2 files changed, 15 insertions(+), 9 deletions(-)
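Before the patch itself, a minimal sketch of the jar-naming change this commit documents (the version string below is an illustrative placeholder, not taken from this commit):

```shell
# Illustrative version only; substitute the actual Table Store release.
VERSION="0.3.0"

# Before: module suffix fused to "spark" (spark2).
# After: hyphenated module name (spark-2), and Spark 3 bundles carry
# the Spark minor version (spark-3.3, spark-3.2, spark-3.1).
OLD_SPARK2_JAR="flink-table-store-spark2-${VERSION}.jar"
NEW_SPARK2_JAR="flink-table-store-spark-2-${VERSION}.jar"
NEW_SPARK33_JAR="flink-table-store-spark-3.3-${VERSION}.jar"

echo "$NEW_SPARK2_JAR"
echo "$NEW_SPARK33_JAR"
```

These are the names the updated docs pass to the `--jars` argument of `spark-shell` / `spark-sql`.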
diff --git a/docs/content/docs/engines/spark2.md b/docs/content/docs/engines/spark2.md
index 318fbbbd..b6decd64 100644
--- a/docs/content/docs/engines/spark2.md
+++ b/docs/content/docs/engines/spark2.md
@@ -36,7 +36,7 @@ Table Store supports Spark 2.4+. It is highly recommended to use Spark 2.4+ vers
{{< stable >}}
-Download [flink-table-store-spark2-{{< version >}}.jar](https://www.apache.org/dyn/closer.lua/flink/flink-table-store-{{< version >}}/flink-table-store-spark2-{{< version >}}.jar).
+Download [flink-table-store-spark-2-{{< version >}}.jar](https://www.apache.org/dyn/closer.lua/flink/flink-table-store-{{< version >}}/flink-table-store-spark-2-{{< version >}}.jar).
You can also manually build bundled jar from the source code.
@@ -56,7 +56,7 @@ Build bundled jar with the following command.
mvn clean install -DskipTests
```
-You can find the bundled jar in `./flink-table-store-spark/flink-table-store-spark2/target/flink-table-store-spark2-{{< version >}}.jar`.
+You can find the bundled jar in `./flink-table-store-spark/flink-table-store-spark-2/target/flink-table-store-spark-2-{{< version >}}.jar`.
## Quick Start
@@ -77,10 +77,10 @@ After the guide, all table files should be stored under the path `/tmp/table_sto
You can append path to table store jar file to the `--jars` argument when starting `spark-shell`.
```bash
-spark-shell ... --jars /path/to/flink-table-store-spark2-{{< version >}}.jar
+spark-shell ... --jars /path/to/flink-table-store-spark-2-{{< version >}}.jar
```
-Alternatively, you can copy `flink-table-store-spark2-{{< version >}}.jar` under `spark/jars` in your Spark installation directory.
+Alternatively, you can copy `flink-table-store-spark-2-{{< version >}}.jar` under `spark/jars` in your Spark installation directory.
**Step 3: Query Table**
diff --git a/docs/content/docs/engines/spark3.md b/docs/content/docs/engines/spark3.md
index a44263bd..dc03da95 100644
--- a/docs/content/docs/engines/spark3.md
+++ b/docs/content/docs/engines/spark3.md
@@ -30,11 +30,17 @@ This documentation is a guide for using Table Store in Spark3.
## Preparing Table Store Jar File
+Table Store currently supports Spark 3.3, 3.2 and 3.1. We recommend the latest Spark version for a better experience.
+
{{< stable >}}
-Table Store currently supports Spark 3.3, 3.2 and 3.1. We recommend the latest Spark version for a better experience.
+Download the jar file with corresponding version.
-Download [flink-table-store-spark-{{< version >}}.jar](https://www.apache.org/dyn/closer.lua/flink/flink-table-store-{{< version >}}/flink-table-store-spark-{{< version >}}.jar).
+| Version | Jar |
+|-----------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
+| Spark 3.3 | [flink-table-store-spark-3.3-{{< version >}}.jar](https://www.apache.org/dyn/closer.lua/flink/flink-table-store-{{< version >}}/flink-table-store-spark-3.3-{{< version >}}.jar) |
+| Spark 3.2 | [flink-table-store-spark-3.2-{{< version >}}.jar](https://www.apache.org/dyn/closer.lua/flink/flink-table-store-{{< version >}}/flink-table-store-spark-3.2-{{< version >}}.jar) |
+| Spark 3.1 | [flink-table-store-spark-3.1-{{< version >}}.jar](https://www.apache.org/dyn/closer.lua/flink/flink-table-store-{{< version >}}/flink-table-store-spark-3.1-{{< version >}}.jar) |
You can also manually build bundled jar from the source code.
@@ -69,10 +75,10 @@ If you are using HDFS, make sure that the environment variable `HADOOP_HOME` or
Append path to table store jar file to the `--jars` argument when starting `spark-sql`.
```bash
-spark-sql ... --jars /path/to/flink-table-store-spark-{{< version >}}.jar
+spark-sql ... --jars /path/to/flink-table-store-spark-3.3-{{< version >}}.jar
```
-Alternatively, you can copy `flink-table-store-spark-{{< version >}}.jar` under `spark/jars` in your Spark installation directory.
+Alternatively, you can copy `flink-table-store-spark-3.3-{{< version >}}.jar` under `spark/jars` in your Spark installation directory.
**Step 2: Specify Table Store Catalog**
@@ -132,7 +138,7 @@ SELECT * FROM my_table;
If you don't want to use Table Store catalog, you can also run `spark-shell` and query the table with Scala API.
```bash
-spark-shell ... --jars /path/to/flink-table-store-spark-{{< version >}}.jar
+spark-shell ... --jars /path/to/flink-table-store-spark-3.3-{{< version >}}.jar
```
```scala