This is an automated email from the ASF dual-hosted git repository.

benjobs pushed a commit to branch dev
in repository 
https://gitbox.apache.org/repos/asf/incubator-streampark-website.git


The following commit(s) were added to refs/heads/dev by this push:
     new ad281c3  [Improve] project doc improvement (#404)
ad281c3 is described below

commit ad281c37f758f35129de538615f1a4f84bb82d31
Author: benjobs <[email protected]>
AuthorDate: Wed Aug 14 23:37:09 2024 +0800

    [Improve] project doc improvement (#404)
    
    * [Improve] project doc improvement
    
    * [Improve] doc improvement
---
 docs/platform/6.project.md                                    | 11 +++++------
 .../current/platform/6.project.md                             |  4 ++++
 2 files changed, 9 insertions(+), 6 deletions(-)

diff --git a/docs/platform/6.project.md b/docs/platform/6.project.md
index 936406c..c0be6b5 100644
--- a/docs/platform/6.project.md
+++ b/docs/platform/6.project.md
@@ -5,17 +5,18 @@ sidebar_position: 6
 ---
 
 ## Feature
+
 The "Project Management" feature of StreamPark simplifies the management and 
deployment processes of Apache Flink and Apache Spark projects. By integrating 
with code repositories, users can automatically pull code, specify branches, 
and use Maven for automated Jar file builds. StreamPark provides CI/CD support, 
enabling automatic build and deployment when code updates occur. Additionally, 
users can submit jobs directly on the platform and monitor them in real-time, 
ensuring stable job ex [...]
 
 ## Project Management
 
-Apache Flink and Apache Spark offer programming models based on Java code for 
job development. For these types of jobs, users are required to write the 
program, build and package it to produce the target Jar, and then execute a 
command line (flink run or spark-submit) to submit the job. StreamPark offers 
project management capabilities, allowing users to easily add Apache 
Flink/Spark projects built using Java Maven into StreamPark for management. 
With StreamPark, the packaging and buildi [...]
+Apache Flink and Apache Spark offer programming models based on Java code for 
job development. For these types of jobs, users are required to write the 
program, build and package it to produce the target Jar, and then execute a 
command line (flink run or spark-submit) to submit the job. StreamPark offers 
project management capabilities, allowing users to easily add Apache 
Flink/Spark projects built with Java and Maven into StreamPark for management 
purposes. In this way, StreamPark will help p [...]
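
The manual workflow that StreamPark automates can be sketched as a pair of 
shell commands (the module paths, class names, and Jar names below are 
hypothetical, purely for illustration):

```shell
# Build and package the project with Maven to produce the target Jar
# (artifact name is hypothetical).
mvn clean package -DskipTests

# Submit the resulting Jar to a Flink cluster...
flink run -c com.example.MyFlinkJob target/my-flink-job-1.0.jar

# ...or, for a Spark project, to a Spark cluster.
spark-submit --class com.example.MySparkJob target/my-spark-job-1.0.jar
```

With project management enabled, these build-and-submit steps are triggered 
from the StreamPark UI instead of being run by hand.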
 
 ![Project](/doc/image/project/project.png)
 
 ## How to use
 
-Click "Project Management" and add a new project. You will be taken to the 
project add page. The operation is as follows:
+Click "Project Management" and add a new project. You will enter the page for 
adding a project. The operation is as follows:
 
 ![How to use](/doc/image/project/project.gif)
 
@@ -26,9 +27,7 @@ Click "Project Management" and add a new project. You will be 
taken to the proje
 - Repository URL: (Required) URL of the project repository, e.g., the 
project's GitHub or GitLab address
 - Password: (Optional) If the project requires a password for access
 - Branches: (Required) The project branch. Branches are loaded automatically 
from the Repository URL so that you can select the one to build.
-- Build Argument: (Optional) Project build parameters
-  The build parameters here are the standard parameters of Maven. For example, 
if you want to specify the profile as dev, then here is -Pdev, and other 
parameters are inferred from this.
-
+- Build Argument: (Optional) Project build parameters. The build parameters 
here are standard Maven parameters. For example, to specify the profile as 
dev, pass -Pdev; other parameters follow the same pattern.
 - POM: (Optional) The location of the pom.xml file of the target module to be 
built. For example, if the [StreamPark 
Quickstart](https://github.com/apache/incubator-streampark-quickstart) project 
directory structure is as follows:
 
   ![Pom](/doc/image/project/pom-position.png)
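
The Build Argument field passes standard Maven options straight through to the 
build. A hedged sketch of the equivalent manual invocations (the `dev` profile 
comes from the text above; the extra options are hypothetical examples of 
other standard Maven parameters):

```shell
# Equivalent manual Maven build with the same Build Argument.
mvn clean package -Pdev

# Other standard Maven options can be combined the same way, e.g.
# skipping tests or activating several profiles at once
# (profile names here are illustrative).
mvn clean package -Pdev,fast -DskipTests
```
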
@@ -37,5 +36,5 @@ Click "Project Management" and add a new project. You will be 
taken to the proje
 
 
 :::tip Remind
-In StreamPark, the project management system has a built-in Maven build 
capability, similar to Jenkins, but only supports projects built using Maven. 
It has performed security verification and interception of dangerous parameters 
on parameter input (Build Argument). If the user's project itself has some 
risks or vulnerabilities, the user needs to handle them by himself. Some risky 
operations, such as using the exec-maven-plugin plug-in in the user's project, 
whether these plug-ins will b [...]
+In StreamPark, the project management module has a built-in Maven build 
capability, similar to Jenkins, but it only supports projects built with 
Maven. Security verification and interception of dangerous parameters in the 
parameter input (Build Argument) will be performed by StreamPark. If the 
user's project itself has risks or vulnerabilities, users need to handle them 
themselves. Some risky operations, such as using the exec-maven-plugin 
plug-in in the user's project, whether these [...]
 :::
diff --git 
a/i18n/zh-CN/docusaurus-plugin-content-docs/current/platform/6.project.md 
b/i18n/zh-CN/docusaurus-plugin-content-docs/current/platform/6.project.md
index f5cc56f..daee062 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/platform/6.project.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/platform/6.project.md
@@ -6,6 +6,10 @@ sidebar_position: 6
 ## Feature
 StreamPark's "Project Management" feature simplifies the management and 
deployment of Apache Flink and Apache Spark projects. By integrating with code 
repositories, users can automatically pull code, specify branches, and use 
Maven to build Jar files automatically. StreamPark provides CI/CD support, 
automatically triggering builds and deployments when code is updated. Jobs can 
also be submitted directly on the platform and monitored in real time, 
ensuring stable operation. This feature enhances the automation, consistency, 
and security of project management, greatly improving development and 
deployment efficiency.
 
+## Feature Introduction
+
+StreamPark's "Project Management" feature simplifies the management and 
deployment of Apache Flink and Apache Spark projects. By integrating with code 
repositories, users can automatically pull code, specify branches, and use 
Maven to build Jar files automatically. StreamPark provides CI/CD 
capabilities that make it easy to build projects; jobs can also be submitted 
directly on the platform and monitored in real time, ensuring stable 
operation. This feature enhances the automation, consistency, and security of 
project management, improving development and deployment efficiency.
+
 ## Project Management
 
 Apache Flink and Apache Spark both provide a Java-code-based programming 
model for developing jobs. For such jobs, users need to write the program, 
build and package it to produce the target Jar, and finally execute a command 
line (`flink run` or `spark-submit`) to submit the job.
