This is an automated email from the ASF dual-hosted git repository.

benjobs pushed a commit to branch dev
in repository https://gitbox.apache.org/repos/asf/incubator-streampark-website.git


The following commit(s) were added to refs/heads/dev by this push:
     new 4d839f76 [Improve] Streamx change to StreamPark
4d839f76 is described below

commit 4d839f765545e89ae2d761f87b46e0c8cce85984
Author: benjobs <[email protected]>
AuthorDate: Tue Sep 12 08:57:43 2023 +0800

    [Improve] Streamx change to StreamPark
---
 docs/connector/6-hbase.md                                              | 2 +-
 docs/user-guide/1-deployment.md                                        | 2 +-
 i18n/zh-CN/docusaurus-plugin-content-docs/current/connector/6-hbase.md | 2 +-
 i18n/zh-CN/docusaurus-plugin-content-docs/current/development/model.md | 2 +-
 4 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/connector/6-hbase.md b/docs/connector/6-hbase.md
index df915404..eeecb3bf 100755
--- a/docs/connector/6-hbase.md
+++ b/docs/connector/6-hbase.md
@@ -430,7 +430,7 @@ class HBaseRequest[T: TypeInformation](@(transient@param) private val stream: Da
 
 }
 ```
-Stramx supports two ways to write data: 1. addSink() 2. writeUsingOutputFormat Examples are as follows:
+StreamPark supports two ways to write data: 1. addSink() 2. writeUsingOutputFormat Examples are as follows:
 ```scala
     //1)Insert way 1
     HBaseSink().sink[TestEntity](source, "order")
diff --git a/docs/user-guide/1-deployment.md b/docs/user-guide/1-deployment.md
index 6002b541..191f9b2e 100755
--- a/docs/user-guide/1-deployment.md
+++ b/docs/user-guide/1-deployment.md
@@ -44,7 +44,7 @@ Using `Flink on Kubernetes` requires additional deployment/or use of an existing
 
 ## Build & Deploy
 
-You can directly download the compiled [**release package**](https://github.com/apache/streampark/releases) (recommended), or you can choose to manually compile and install. The manual compilation and installation steps are as follows:
+You can directly download the compiled [**release package**](https://github.com/apache/incubator-streampark/releases) (recommended), or you can choose to manually compile and install. The manual compilation and installation steps are as follows:
 
 
 ### Environmental requirements
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/connector/6-hbase.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/connector/6-hbase.md
index 99472ac7..cd4c9826 100755
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/connector/6-hbase.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/connector/6-hbase.md
@@ -423,7 +423,7 @@ class HBaseRequest[T: TypeInformation](@(transient@param) private val stream: Da
 
 }
 ```
-Stramx supports two ways to write data: 1. addSink() 2. writeUsingOutputFormat. Examples are as follows:
+StreamPark supports two ways to write data: 1. addSink() 2. writeUsingOutputFormat. Examples are as follows:
 ```scala
    //1)Insert way 1
     HBaseSink().sink[TestEntity](source, "order")
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/model.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/model.md
index 2f024102..cc04a961 100755
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/model.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/model.md
@@ -528,7 +528,7 @@ StreamTableContext context = new StreamTableContext(JavaConfig);
 :::
 
 ## Directory structure
-The recommended project directory structure is as follows; for details, refer to the directory structure and configuration in [Streampark-flink-quickstart](https://github.com/apache/streampark-quickstart)
+The recommended project directory structure is as follows; for details, refer to the directory structure and configuration in [StreamPark Quickstart](https://github.com/apache/incubator-streampark-quickstart)
 
 ``` tree
 .
