This is an automated email from the ASF dual-hosted git repository.

benjobs pushed a commit to branch dev
in repository 
https://gitbox.apache.org/repos/asf/incubator-streampark-website.git


The following commit(s) were added to refs/heads/dev by this push:
     new 394d5c6  docs improvement
394d5c6 is described below

commit 394d5c605c576b8f568a7e1e0c35373b7834c817
Author: benjobs <[email protected]>
AuthorDate: Wed Oct 4 12:46:59 2023 +0800

    docs improvement
---
 ...224\237\344\272\247\345\256\236\350\267\265.md" | 131 ++++-----------------
 ...224\237\344\272\247\345\256\236\350\267\265.md" |   6 +
 ...277\220\347\273\264\345\256\236\350\267\265.md" |   6 +
 docs/components/TableData.jsx                      |  49 +++++++-
 docs/components/data.js                            |  27 +++--
 docs/user-guide/1-deployment.md                    |  10 +-
 ...DevelopmentAndDebugging.md => 3-development.md} |  21 ++--
 .../current/components/TableData.jsx               |  48 +++++++-
 .../current/components/data.js                     |  26 ++--
 .../current/user-guide/1-deployment.md             |  10 +-
 ...DevelopmentAndDebugging.md => 3-development.md} |  23 ++--
 package.json                                       |  18 +--
 static/doc/image/streampark_build_success.png      | Bin 61813 -> 0 bytes
 13 files changed, 194 insertions(+), 181 deletions(-)

diff --git "a/blog/StreamPark \345\234\250 Joyme 
\347\232\204\347\224\237\344\272\247\345\256\236\350\267\265.md" 
"b/blog/StreamPark \345\234\250 Joyme 
\347\232\204\347\224\237\344\272\247\345\256\236\350\267\265.md"
index 310c832..5d14350 100644
--- "a/blog/StreamPark \345\234\250 Joyme 
\347\232\204\347\224\237\344\272\247\345\256\236\350\267\265.md"      
+++ "b/blog/StreamPark \345\234\250 Joyme 
\347\232\204\347\224\237\344\272\247\345\256\236\350\267\265.md"      
@@ -1,18 +1,22 @@
-# StreamPark's Production Practice at Joyme!
+---
+slug: streampark-usercase-joyme
+title: StreamPark's Production Practice at Joyme
+tags: [StreamPark, Production Practice, FlinkSQL]
+---
 
-**Abstract:**This article presents StreamPark's production practice at Joyme. The author is Qin Jiyong, a big data engineer at Joyme. The main contents are:
+<br/>
 
-1. Meeting StreamPark
-2. Flink SQL job development
-3. Custom code job development
-4. Monitoring and alerting
-5. Common problems
-6. Community impressions
-7. Summary
+**Abstract:** This article presents StreamPark's production practice at Joyme. The author is Qin Jiyong, a big data engineer at Joyme. The main contents are:
 
+- Meeting StreamPark
+- Flink SQL job development
+- Custom code job development
+- Monitoring and alerting
+- Common problems
+- Community impressions
+- Summary
 
-
-## 01 Meeting StreamPark
+## 1 Meeting StreamPark
 
 Meeting StreamPark was inevitable: given our existing model of real-time job development, we had to find an open-source platform to support our company's real-time business. Our situation at the time was as follows:
 
@@ -25,7 +29,7 @@
 
 Our first encounter with StreamPark basically sealed the decision. Following the official documentation, we quickly deployed and installed it and tried it out. The friendly UI, multi-version Flink support, permission management, job monitoring, and a series of other features already met our needs well. We also learned that the community is very active; having watched StreamPark's features mature since version 1.1.0, we can say the development team is ambitious, and we trust the platform will keep improving.
 
-## 02 Flink SQL Job Development
+## 2 Flink SQL Job Development
 
 The Flink SQL development mode is a great convenience: simple metrics can be built with nothing but SQL, without writing a single line of code. Flink SQL eases the work of many colleagues; after all, writing code is still a hurdle for some data warehouse engineers.
 
@@ -33,7 +37,7 @@ The Flink SQL development mode is a great convenience: simple metrics can
 
 For the Flink SQL part, simply write the logic step by step following the official Flink documentation. For us there are generally three parts: the Source, the intermediate processing logic, and finally the Sink. The Source almost always consumes Kafka; the logic layer usually joins MySQL for dimension-table lookups; and the Sink is mostly ES, Redis, or MySQL.
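In skeleton form, that pattern looks roughly like the following; every table, topic, host, and field name here is invented purely for illustration:

```sql
-- Source: consume a Kafka topic (all names below are hypothetical).
CREATE TABLE source_table (
  uid STRING,
  proc_time AS PROCTIME()  -- processing-time attribute used by the lookup join
) WITH (
  'connector' = 'kafka',
  'topic' = 'demo_topic',
  'properties.bootstrap.servers' = 'kafka:9092',
  'properties.group.id' = 'demo_group',
  'scan.startup.mode' = 'latest-offset',
  'format' = 'json'
);

-- Dimension table in MySQL, queried record by record via a lookup join.
CREATE TABLE dim_user (
  uid STRING,
  user_name STRING
) WITH (
  'connector' = 'jdbc',
  'url' = 'jdbc:mysql://mysql:3306/demo',
  'table-name' = 'dim_user',
  'username' = 'demo',
  'password' = 'demo'
);

-- Sink to Elasticsearch; Redis and MySQL sinks follow the same shape.
CREATE TABLE sink_table (
  uid STRING,
  user_name STRING
) WITH (
  'connector' = 'elasticsearch-7',
  'hosts' = 'http://es:9200',
  'index' = 'demo_index'
);

-- Logic: enrich the stream with the dimension table, then write out.
INSERT INTO sink_table
SELECT s.uid, d.user_name
FROM source_table AS s
LEFT JOIN dim_user FOR SYSTEM_TIME AS OF s.proc_time AS d
ON s.uid = d.uid;
```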
 
-#### **1. Write the SQL**
+### **1. Write the SQL**
 
 ```sql
 -- connect to Kafka
@@ -96,7 +100,7 @@ SELECT  Data.uid  FROM source_table;
 
 ![](/blog/Joyme/application_job.png)
 
-## 03 Custom code Job Development
+## 3 Custom Code Job Development
 
 We develop our Streaming jobs in Flink Java, having refactored the previous Spark Scala, Flink Scala, and Flink Java jobs and merged them into one project to make maintenance easier. Custom code jobs require committing the code to Git and then configuring the project:
 
@@ -114,7 +118,7 @@ We develop our Streaming jobs in Flink Java, having refactored the previous
 
 ![](/blog/Joyme/application_interface.png)
 
-## 04 Monitoring and Alerting
+## 4 Monitoring and Alerting
 
 StreamPark's monitoring requires configuring the basic email-sending settings in the setting module.
 
@@ -130,96 +134,7 @@ StreamPark's monitoring requires configuring the basic email-sending settin
 
 For alerting, we built a scheduled task on top of StreamPark's t_flink_app table. Why? Because notifications such as email are easy to miss, so we monitor the state of each job and push the corresponding alerts to our Feishu alert group, which lets us spot and fix problems in time. It is a simple Python script (shown below), scheduled with crontab.
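At its core, the scheduled task is just a status poll against that table. The query below mirrors the one in the script that follows, together with the state codes the script maps to readable labels:

```sql
-- Poll StreamPark's t_flink_app table for jobs in abnormal states.
-- State codes, as mapped in the script below:
--   8 = FAILING, 9 = FAILED, 15 = LOST, anything else (e.g. -9) = KILLED
SELECT state, job_name, alert_email
FROM t_flink_app
WHERE state IN (-9, 8, 9, 15);
```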
 
-```python
-# Python 2 script: poll StreamPark's t_flink_app table and push
-# abnormal job states to a Feishu (Lark) bot webhook.
-import MySQLdb
-import json
-import requests
-
-
-def connect_mysql():
-    # MySQLdb.connect positional args: host, user, password, database
-    db = MySQLdb.connect("mysqlhost", "database", "password", "dstream", charset='utf8')
-    cursor = db.cursor()
-    cursor.execute("select STATE,JOB_NAME,ALERT_EMAIL from t_flink_app where state in (-9,15,8,9)")
-    data = cursor.fetchall()
-    db.close()
-    return data
-
-
-def alert(data):
-    for row in data:
-        send(row)
-
-
-def send(row):
-    webhook = 'the Feishu bot webhook URL'
-
-    # Map the numeric state code to a readable label.
-    if int(row[0]) == 8:
-        state = 'FAILING'
-    elif int(row[0]) == 9:
-        state = 'FAILED'
-    elif int(row[0]) == 15:
-        state = 'LOST'
-    else:
-        state = 'KILLED'
-
-    email = row[2]
-    if email is None:
-        email = '-'
-
-    payload_mes = {
-        "msg_type": "post",
-        "content": {
-            "post": {
-                "zh_cn": {
-                    "title": "Flink real-time job alert",
-                    "content": [
-                        [{
-                            "tag": "text",
-                            "text": "Job name: " + row[1].encode('utf-8')
-                        }],
-                        [{
-                            "tag": "text",
-                            "text": "Job state: " + state
-                        }],
-                        [{
-                            "tag": "text",
-                            "text": "Alert email: " + email.encode('utf-8')
-                        }]
-                    ]
-                }
-            }
-        }
-    }
-    headers = {
-        'Content-Type': 'application/json'
-    }
-
-    res = requests.request('POST', webhook, headers=headers, data=json.dumps(payload_mes))
-    print(res.text)
-
-
-if __name__ == '__main__':
-    data = connect_mysql()
-    alert(data)
-    print(data)
-
-```
-
-## 05 Common Problems
+## 5 Common Problems
 
 As for job failures, we have analyzed and grouped them into basically the following categories:
 
@@ -235,12 +150,12 @@ if __name__ == '__main__':
 
 ![](/blog/Joyme/yarn_log.png)
 
-## 06 Community Impressions
+## 6 Community Impressions
 
 Whenever we discuss problems in the StreamPark user group, community members respond right away. Issues we file that cannot be solved immediately are generally fixed in the next release or on the latest code branch. In the group we also see many people outside the community actively helping one another, along with veterans from other communities, and many members have joined the community's development work. Overall, the community feels very active!
 
-## 07 Summary
+## 7 Summary
 
-We currently run 60 real-time jobs in production, roughly half Flink SQL and half custom code, and more real-time tasks will go live. Many people worry about whether StreamPark is stable. From our several months of production experience, StreamPark is simply a platform that helps you develop, deploy, monitor, and manage jobs. Whether things run stably depends mostly on whether your own Hadoop YARN cluster is stable (we use the on-yarn mode); that has little to do with StreamPark itself. The other factor is how robust your Flink SQL or code is. These two aspects are what deserve attention; get them right, then exploit StreamPark's flexibility, and your jobs will run all the better. Judging StreamPark's stability from one side alone is simply one-sided.
+We currently run 60 real-time jobs in production, roughly half Flink SQL and half custom code, and more real-time tasks will go live. Many people worry about whether StreamPark is stable. From our several months of production experience, StreamPark is simply a platform that helps you develop, deploy, monitor, and manage jobs. Whether things run stably depends mostly on whether your own Hadoop YARN cluster is stable (we use the on-yarn mode); that has little to do with StreamPark itself. The other factor is how robust your Flink SQL or code is. These two aspects are what deserve attention; get them right, then exploit StreamPark's flexibility, and your jobs will run all the better. Judging StreamPark's stability from one side alone is simply one-sided.
 
 That is everything we have to share about StreamPark at Joyme; thank you for reading this far. Many thanks to the StreamPark team for providing us with such an excellent product; this is truly work that benefits others. From 1.0 to 1.2.1, the bugs we hit have been fixed promptly, and every issue is taken seriously. We still use the on-yarn deployment mode, and restarting YARN still leaves jobs in the LOST state; restarting YARN is not an everyday event, and the community will fix this issue as soon as possible. We believe StreamPark will get better and better, and its future is promising.
\ No newline at end of file
diff --git "a/blog/StreamPark 
\345\234\250\351\241\272\347\275\221\347\247\221\346\212\200\347\232\204\345\244\247\350\247\204\346\250\241\347\224\237\344\272\247\345\256\236\350\267\265.md"
 "b/blog/StreamPark 
\345\234\250\351\241\272\347\275\221\347\247\221\346\212\200\347\232\204\345\244\247\350\247\204\346\250\241\347\224\237\344\272\247\345\256\236\350\267\265.md"
index e2832fc..4d9d330 100644
--- "a/blog/StreamPark 
\345\234\250\351\241\272\347\275\221\347\247\221\346\212\200\347\232\204\345\244\247\350\247\204\346\250\241\347\224\237\344\272\247\345\256\236\350\267\265.md"
 
+++ "b/blog/StreamPark 
\345\234\250\351\241\272\347\275\221\347\247\221\346\212\200\347\232\204\345\244\247\350\247\204\346\250\241\347\224\237\344\272\247\345\256\236\350\267\265.md"
 
@@ -1,3 +1,9 @@
+---
+slug: streampark-usercase-shunwang
+title: StreamPark's Large-Scale Production Practice at Shunwang Technology
+tags: [StreamPark, Production Practice, FlinkSQL]
+---
+
 # StreamPark's Large-Scale Production Practice at Shunwang Technology
 
 ![](/blog/SF/autor.png)
diff --git "a/blog/\350\201\224\351\200\232 Flink 
\345\256\236\346\227\266\350\256\241\347\256\227\345\271\263\345\217\260\345\214\226\350\277\220\347\273\264\345\256\236\350\267\265.md"
 "b/blog/\350\201\224\351\200\232 Flink 
\345\256\236\346\227\266\350\256\241\347\256\227\345\271\263\345\217\260\345\214\226\350\277\220\347\273\264\345\256\236\350\267\265.md"
index 0ad770c..c4056d0 100644
--- "a/blog/\350\201\224\351\200\232 Flink 
\345\256\236\346\227\266\350\256\241\347\256\227\345\271\263\345\217\260\345\214\226\350\277\220\347\273\264\345\256\236\350\267\265.md"
     
+++ "b/blog/\350\201\224\351\200\232 Flink 
\345\256\236\346\227\266\350\256\241\347\256\227\345\271\263\345\217\260\345\214\226\350\277\220\347\273\264\345\256\236\350\267\265.md"
     
@@ -1,3 +1,9 @@
+---
+slug: streampark-usercase-chinaunion
+title: China Unicom's Flink Real-Time Computing Platform Operations Practice
+tags: [StreamPark, Production Practice, FlinkSQL]
+---
+
 # China Unicom's Flink Real-Time Computing Platform Operations Practice
 
 **Abstract:** This article is based on a talk by Mu Chunjin, head of the real-time computing team at China Unicom Digital Technology and an Apache StreamPark Committer, at the platform construction session of Flink Forward Asia 2022. The content is divided into four parts:
diff --git a/docs/components/TableData.jsx b/docs/components/TableData.jsx
index 968be01..0c47035 100644
--- a/docs/components/TableData.jsx
+++ b/docs/components/TableData.jsx
@@ -326,7 +326,49 @@ const ClientTables = () => {
 };
 
 
-const ClientEnvs = () => {
+
+const DeploymentEnvs = () => {
+    return (
+
+        <div>
+            <table className="table-data" style={{width: '100%', display: 'inline-table'}}>
+                <thead>
+                <tr>
+                    <td>Item</td>
+                    <td>Version</td>
+                    <td>Required</td>
+                    <td>Other</td>
+                </tr>
+                </thead>
+                <tbody>
+                {
+                    dataSource.deploymentEnvs.map((item, i) => (
+                        <tr key={i}>
+                            <td>
+                                <span className="label-info">{item.name}</span>
+                            </td>
+                            <td>{item.version}</td>
+                            <td>
+                                {
+                                    item.required
+                                        ?
+                                        <span className="icon-toggle-on" 
title="Required"></span>
+                                        :
+                                        <span className="icon-toggle-off" 
title="Optional"></span>
+                                }
+                            </td>
+                            <td>{item.other}</td>
+                        </tr>
+                    ))
+                }
+                </tbody>
+            </table>
+        </div>
+
+    );
+};
+
+const DevelopmentEnvs = () => {
     return (
 
         <div>
@@ -341,7 +383,7 @@ const ClientEnvs = () => {
                 </thead>
                 <tbody>
                 {
-                    dataSource.envs.map((item, i) => (
+                    dataSource.developmentEnvs.map((item, i) => (
                         <tr key={i}>
                             <td>
                                 <span className="label-info">{item.name}</span>
@@ -378,5 +420,6 @@ export {
     ClientFixedDelay,
     ClientFailureRate,
     ClientTables,
-    ClientEnvs
+    DevelopmentEnvs,
+    DeploymentEnvs
 };
diff --git a/docs/components/data.js b/docs/components/data.js
index 1caac99..eff55c8 100644
--- a/docs/components/data.js
+++ b/docs/components/data.js
@@ -92,13 +92,24 @@ export default {
         {name: 'catalog', desc: 'Catalog; if specified, it is used during initialization', value: ''},
         {name: 'database', desc: 'Database; if specified, it is used during initialization', value: ''},
     ],
-    envs: [
-        {name: 'Operating System', version: 'Linux', required: true, other: 'UnSupport Windows'},
+
+    deploymentEnvs: [
+        {name: 'OS', version: 'Linux', required: true, other: 'Windows is not supported'},
         {name: 'JAVA', version: '1.8+', required: true, other: null},
-        {name: 'Maven', version: '3+', required: false, other: 'Optionally install Maven'},
-        {name: 'Node.js', version: '', required: true, other: 'Node environment'},
-        {name: 'Flink', version: '1.12.0+', required: true, other: 'The version must be 1.12+'},
-        {name: 'Hadoop', version: '2+', required: false, other: 'Optional, If on yarn, hadoop environment is required.'},
-        {name: 'MySQL', version: '5.6+', required: false, other: 'Optionally install MySQL'}
-    ]
+        {name: 'MySQL', version: '5.6+', required: true, other: null},
+        {name: 'Flink', version: '1.12.0+', required: true, other: 'Flink version >= 1.12'},
+        {name: 'Hadoop', version: '2+', required: false, other: 'Optional; if running on YARN, a Hadoop environment is required.'},
+    ],
+
+    developmentEnvs: [
+        {name: 'OS', version: 'Linux', required: false, other: 'Windows is supported; Mac/Linux is recommended.'},
+        {name: 'IDE', version: 'Intellij IDEA', required: false, other: 'Intellij IDEA is recommended'},
+        {name: 'JAVA', version: '1.8+', required: true, other: null},
+        {name: 'Scala', version: '2.12.x', required: true, other: null},
+        {name: 'Nodejs', version: '16.15.1+', required: true, other: 'Node >=16.15.1 <=18, https://nodejs.org'},
+        {name: 'pnpm', version: '7.11.2', required: true, other: 'npm install -g pnpm'},
+        {name: 'Flink', version: '1.12.0+', required: true, other: 'Flink >= 1.12, just download and unpack it.'},
+        {name: 'MySQL', version: '5.6+', required: false, other: null},
+        {name: 'Hadoop', version: '2+', required: false, other: 'Optional; if running on YARN, a Hadoop environment is required.'},
+    ],
 }
diff --git a/docs/user-guide/1-deployment.md b/docs/user-guide/1-deployment.md
index 191f9b2..90b5766 100755
--- a/docs/user-guide/1-deployment.md
+++ b/docs/user-guide/1-deployment.md
@@ -4,7 +4,7 @@ title: 'Platform Deployment'
 sidebar_position: 1
 ---
 
-import { ClientEnvs } from '../components/TableData.jsx';
+import { DeploymentEnvs } from '../components/TableData.jsx';
 
 The overall component stack of StreamPark is as follows. It consists of two major parts: streampark-core and streampark-console. streampark-console is a very important module, positioned as an **integrated real-time data platform**, **streaming data warehouse platform**, **Low Code**, and **Flink & Spark task hosting platform**; it can better manage Flink tasks, integrating project compilation, publishing, parameter configuration, startup, savepoint, flame graph, Flink S [...]
 
@@ -14,13 +14,7 @@ streampark-console provides an out-of-the-box installation package. Before insta
 
 ## Environmental requirements
 
-<ClientEnvs></ClientEnvs>
-
-:::tip Notice
-Versions up to and including StreamPark 1.2.2 only support `scala 2.11`, so be sure to check the corresponding `scala` version when using `flink`.
-Versions from 1.2.3 onward support both `scala 2.11` and `scala 2.12`.
-:::
-
+<DeploymentEnvs></DeploymentEnvs>
 
 At present, for Flink task releases, StreamPark supports both `Flink on YARN` and `Flink on Kubernetes` modes.
 
diff --git a/docs/user-guide/3-localDevelopmentAndDebugging.md b/docs/user-guide/3-development.md
similarity index 68%
rename from docs/user-guide/3-localDevelopmentAndDebugging.md
rename to docs/user-guide/3-development.md
index fa2fbaf..d397e39 100755
--- a/docs/user-guide/3-localDevelopmentAndDebugging.md
+++ b/docs/user-guide/3-development.md
@@ -1,17 +1,14 @@
 ---
-id: 'local development and debugging'
-title: 'Local Development and Debugging'
+id: 'development'
+title: 'Development Guide'
 sidebar_position: 3
 ---
 
 ### Environment Requirements
 
-- Maven 3.6+
-- nodejs (version >= 16.14)
-- npm 7.11.2 ( https://nodejs.org/en/ )
-- pnpm (npm install -g pnpm)
-- JDK 1.8+
-- Scala 2.12.x
+import { DevelopmentEnvs } from '../components/TableData.jsx';
+
+<DevelopmentEnvs></DevelopmentEnvs>
 
 ### Clone the Source Code
 
@@ -26,8 +23,6 @@ cd incubator-streampark/
 ./build.sh
 ```
 
-![Build Success](/doc/image/streampark_build_success.png)
-
 ### Open the Project
 
 Here, we are using `idea` to open the project.
@@ -40,12 +35,12 @@ open -a /Applications/IntelliJ\ IDEA\ CE.app/ ./
 
 ```bash
 cd ./dist
-tar -zxvf apache-streampark-2.2.0-SNAPSHOT-incubating-bin.tar.gz
+tar -zxvf apache-streampark-2.2.0-incubating-bin.tar.gz
 ```
 
 ### Copy the Path
 
-Copy the path of the extracted directory, for example: `/Users/user/IdeaProjects/incubator-streampark/dist/apache-streampark_2.12-2.2.0-SNAPSHOT-incubating-bin`
+Copy the path of the extracted directory, for example: `${workspace}/incubator-streampark/dist/apache-streampark-2.2.0-incubating-bin`
 
 ### Start the Backend Service
 
@@ -58,7 +53,7 @@ Modify the launch configuration
 Check `Add VM options`, and input the parameter `-Dapp.home=$path`, where 
`$path` is the path we just copied.
 
 ```bash
--Dapp.home=/Users/user/IdeaProjects/incubator-streampark/dist/apache-streampark_2.12-2.2.0-SNAPSHOT-incubating-bin
+-Dapp.home=${workspace}/incubator-streampark/dist/apache-streampark-2.2.0-incubating-bin
 ```
 
 ![Streampark Run Config](/doc/image/streampark_run_config.jpeg)
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/components/TableData.jsx b/i18n/zh-CN/docusaurus-plugin-content-docs/current/components/TableData.jsx
index 06a1037..0cc6f20 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/components/TableData.jsx
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/components/TableData.jsx
@@ -326,7 +326,7 @@ const ClientTables = () => {
 };
 
 
-const ClientEnvs = () => {
+const DeploymentEnvs = () => {
     return (
 
         <div>
@@ -341,7 +341,48 @@ const ClientEnvs = () => {
                 </thead>
                 <tbody>
                 {
-                    dataSource.envs.map((item, i) => (
+                    dataSource.deploymentEnvs.map((item, i) => (
+                        <tr key={i}>
+                            <td>
+                                <span className="label-info">{item.name}</span>
+                            </td>
+                            <td>{item.version}</td>
+                            <td>
+                                {
+                                    item.required
+                                        ?
+                                        <span className="icon-toggle-on" 
title="必须"></span>
+                                        :
+                                        <span className="icon-toggle-off" 
title="可选"></span>
+                                }
+                            </td>
+                            <td>{item.other}</td>
+                        </tr>
+                    ))
+                }
+                </tbody>
+            </table>
+        </div>
+
+    );
+};
+
+const DevelopmentEnvs = () => {
+    return (
+
+        <div>
+            <table className="table-data" style={{width: '100%', display: 'inline-table'}}>
+                <thead>
+                <tr>
+                    <td>Item</td>
+                    <td>Version</td>
+                    <td>Required</td>
+                    <td>Other</td>
+                </tr>
+                </thead>
+                <tbody>
+                {
+                    dataSource.developmentEnvs.map((item, i) => (
                         <tr key={i}>
                             <td>
                                 <span className="label-info">{item.name}</span>
@@ -378,5 +419,6 @@ export {
     ClientFixedDelay,
     ClientFailureRate,
     ClientTables,
-    ClientEnvs
+    DevelopmentEnvs,
+    DeploymentEnvs
 };
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/components/data.js b/i18n/zh-CN/docusaurus-plugin-content-docs/current/components/data.js
index 944ac20..5bce52d 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/components/data.js
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/components/data.js
@@ -96,13 +96,25 @@ export default {
         {name: 'catalog', desc: 'Specifies the catalog; if set, it is used during initialization', value: ''},
         {name: 'database', desc: 'Specifies the database; if set, it is used during initialization', value: ''},
     ],
-    envs: [
+
+    deploymentEnvs: [
         {name: 'OS', version: 'Linux', required: true, other: 'Windows is not supported'},
         {name: 'JAVA', version: '1.8+', required: true, other: null},
-        {name: 'Maven', version: '3+', required: false, other: 'Optional on the deployment machine (used for project compilation)'},
-        {name: 'Node.js', version: '', required: true, other: 'Node.js environment'},
-        {name: 'Flink', version: '1.12.0+', required: true, other: 'The version must be 1.12.x or above, and the Scala version must be 2.11'},
-        {name: 'Hadoop', version: '2+', required: false, other: 'Optional; an on-yarn deployment requires a Hadoop environment with the relevant environment variables configured'},
-        {name: 'MySQL', version: '5.6+', required: false, other: 'Install MySQL on the deployment machine or another machine'}
-    ]
+        {name: 'MySQL', version: '5.6+', required: true, other: 'H2 is used by default; MySQL is recommended'},
+        {name: 'Flink', version: '1.12.0+', required: true, other: 'Flink version >= 1.12'},
+        {name: 'Hadoop', version: '2+', required: false, other: 'Optional; a Hadoop environment is required when deploying Flink on YARN.'},
+    ],
+
+    developmentEnvs: [
+        {name: 'OS', version: 'Linux', required: false, other: 'Windows is supported; Mac/Linux is recommended.'},
+        {name: 'IDE', version: 'Intellij IDEA', required: false, other: 'Intellij IDEA is recommended'},
+        {name: 'JAVA', version: '1.8+', required: true, other: null},
+        {name: 'Scala', version: '2.12.x', required: true, other: null},
+        {name: 'Nodejs', version: '16.15.1+', required: true, other: 'Node >=16.15.1 <=18, https://nodejs.org'},
+        {name: 'pnpm', version: '7.11.2', required: true, other: 'npm install -g pnpm'},
+        {name: 'Flink', version: '1.12.0+', required: true, other: 'Flink >= 1.12; just download it from the Flink website and unpack it'},
+        {name: 'MySQL', version: '5.6+', required: false, other: null},
+        {name: 'Hadoop', version: '2+', required: false, other: 'Optional; the Hadoop environment variables must be configured when deploying Flink on YARN.'},
+    ],
+
 }
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/user-guide/1-deployment.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/user-guide/1-deployment.md
index a9833c4..5856cd3 100755
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/user-guide/1-deployment.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/user-guide/1-deployment.md
@@ -4,7 +4,7 @@ title: 'Installation and Deployment'
 sidebar_position: 1
 ---
 
-import { ClientEnvs } from '../components/TableData.jsx';
+import { DeploymentEnvs } from '../components/TableData.jsx';
 
 StreamPark's overall component stack is structured as follows. It consists of two major parts, streampark-core and streampark-console. streampark-console is a very important module, positioned as an **integrated real-time data platform**, **streaming data warehouse platform**, **Low Code**, and **Flink & Spark task hosting platform**. It manages Flink tasks well, integrating project compilation, publishing, parameter configuration, startup, savepoints, flame graphs, Flink SQL, monitoring, and many other features, greatly simplifying the daily operation and maintenance of Flink tasks while incorporating many best practices. Its ultimate goal is to become a one-stop big data solution that unifies real-time data warehousing and batch-stream processing.
 
@@ -14,13 +14,7 @@ streampark-console provides an out-of-the-box installation package; before installation, the environment
 
 ## Environment Requirements
 
-<ClientEnvs></ClientEnvs>
-
-:::tip Note
-Versions up to and including StreamPark 1.2.2 only support `scala 2.11`, so be sure to check the corresponding `scala` version when using `flink`.
-Versions from 1.2.3 onward support both `scala 2.11` and `scala 2.12`.
-:::
-
+<DeploymentEnvs></DeploymentEnvs>
 
 At present, for Flink task releases, StreamPark supports both `Flink on YARN` and `Flink on Kubernetes` modes.
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/user-guide/3-localDevelopmentAndDebugging.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/user-guide/3-development.md
similarity index 61%
rename from i18n/zh-CN/docusaurus-plugin-content-docs/current/user-guide/3-localDevelopmentAndDebugging.md
rename to i18n/zh-CN/docusaurus-plugin-content-docs/current/user-guide/3-development.md
index 2799244..0694310 100755
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/user-guide/3-localDevelopmentAndDebugging.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/user-guide/3-development.md
@@ -1,17 +1,14 @@
 ---
-id: 'local development and debugging'
-title: 'Local Development and Debugging'
+id: 'development'
+title: 'Development Guide'
 sidebar_position: 3
 ---
 
 ### Environment Requirements
 
-- Maven 3.6+
-- nodejs (version >= 16.14)
-- npm 7.11.2 ( https://nodejs.org/en/ )
-- pnpm (npm install -g pnpm)
-- JDK 1.8+
-- Scala 2.12.x
+import { DevelopmentEnvs } from '../components/TableData.jsx';
+
+<DevelopmentEnvs></DevelopmentEnvs>
 
 ### Clone the Source Code
 
@@ -26,8 +23,6 @@ cd incubator-streampark/
 ./build.sh
 ```
 
-![Build Success](/doc/image/streampark_build_success.png)
-
 ### Open the Project

 Here we use `idea` to open the project.
@@ -40,16 +35,16 @@ open -a /Applications/IntelliJ\ IDEA\ CE.app/ ./
 
 ```bash
 cd ./dist
-tar -zxvf apache-streampark-2.2.0-SNAPSHOT-incubating-bin.tar.gz
+tar -zxvf apache-streampark-2.2.0-incubating-bin.tar.gz
 ```
 
 ### Copy the Path

-Copy the extracted directory path, for example: `/Users/user/IdeaProjects/incubator-streampark/dist/apache-streampark_2.12-2.2.0-SNAPSHOT-incubating-bin`
+Copy the extracted directory path, for example: `${workspace}/incubator-streampark/dist/apache-streampark-2.2.0-incubating-bin`
 
 ### Start the Backend Service

-Locate `streampark-console/streampark-console-service/src/main/java/org/apache/streampark/console/StreamParkConsoleBootstrap.java`
+Locate `streampark-console/streampark-console-service/src/main/java/org/apache/streampark/console/StreamParkConsoleBootstrap.java`

 Modify the launch configuration
 
@@ -58,7 +53,7 @@ tar -zxvf apache-streampark-2.2.0-SNAPSHOT-incubating-bin.tar.gz
 Check `Add VM options` and enter the parameter `-Dapp.home=$path`, where `$path` is the path we just copied.
 
 ```bash
--Dapp.home=/Users/user/IdeaProjects/incubator-streampark/dist/apache-streampark_2.12-2.2.0-SNAPSHOT-incubating-bin
+-Dapp.home=${workspace}/incubator-streampark/dist/apache-streampark-2.2.0-incubating-bin
 ```
 
 ![Streampark Run Config](/doc/image/streampark_run_config.jpeg)
diff --git a/package.json b/package.json
index 8cc2025..5128240 100644
--- a/package.json
+++ b/package.json
@@ -16,25 +16,25 @@
     "typecheck": "tsc"
   },
   "dependencies": {
-    "@docusaurus/core": "^2.1.0",
-    "@docusaurus/plugin-content-docs": "^2.1.0",
-    "@docusaurus/preset-classic": "^2.1.0",
-    "@easyops-cn/docusaurus-search-local": "^0.33.5",
+    "@docusaurus/core": "2.4.3",
+    "@docusaurus/plugin-content-docs": "^2.4.3",
+    "@docusaurus/preset-classic": "2.4.3",
+    "@easyops-cn/docusaurus-search-local": "^0.36.0",
     "@mdx-js/react": "^1.6.22",
     "@svgr/webpack": "^6.2.1",
     "aos": "^2.3.4",
-    "clsx": "^1.1.1",
+    "clsx": "^1.2.1",
     "file-loader": "^6.2.0",
     "prism-react-renderer": "^1.3.1",
-    "react": "^17.0.2",
+    "react": "^18.2.0",
     "react-copy-to-clipboard": "^5.1.0",
-    "react-dom": "^17.0.2",
-    "react-tsparticles": "^2.1.4",
+    "react-dom": "^18.2.0",
+    "react-tsparticles": "^2.12.2",
     "sass-loader": "^13.0.2",
     "url-loader": "^4.1.1"
   },
   "devDependencies": {
-    "@docusaurus/module-type-aliases": "^2.1.0",
+    "@docusaurus/module-type-aliases": "2.4.3",
     "@tsconfig/docusaurus": "^1.0.4",
     "docusaurus-plugin-less": "^2.0.2",
     "less": "^4.1.2",
diff --git a/static/doc/image/streampark_build_success.png b/static/doc/image/streampark_build_success.png
deleted file mode 100644
index 4b92759..0000000
Binary files a/static/doc/image/streampark_build_success.png and /dev/null differ
