This is an automated email from the ASF dual-hosted git repository.

benjobs pushed a commit to branch dev
in repository https://gitbox.apache.org/repos/asf/incubator-streampark-website.git


The following commit(s) were added to refs/heads/dev by this push:
     new b8b7895  Add some words about Docker deployment (#219)
b8b7895 is described below

commit b8b789556ec9b5a9a71e5035a9eafd732f021462
Author: VampireAchao <[email protected]>
AuthorDate: Tue Aug 22 10:35:46 2023 +0800

    Add some words about Docker deployment (#219)
    
    * Add some words about Docker deployment
---
 docs/user-guide/4-dockerDeployment.md              | 107 ++++++++++++++++++++-
 .../current/user-guide/4-dockerDeployment.md       | 102 +++++++++++++++++++-
 static/doc/image/streampark_docker_ls_hadoop.png   | Bin 0 -> 106300 bytes
 static/doc/image/streampark_docker_ps.png          | Bin 0 -> 140437 bytes
 4 files changed, 206 insertions(+), 3 deletions(-)

diff --git a/docs/user-guide/4-dockerDeployment.md b/docs/user-guide/4-dockerDeployment.md
index add232b..141270b 100644
--- a/docs/user-guide/4-dockerDeployment.md
+++ b/docs/user-guide/4-dockerDeployment.md
@@ -19,6 +19,8 @@ To start the service with docker-compose, you need to install [docker-compose](h
 
 ### StreamPark deployment based on h2 and docker-compose
 
+This method is suitable for beginners to learn and become familiar with the features. The configuration is reset after the container restarts. Below, you can configure MySQL or PostgreSQL for persistence.
+
 #### Deployment
 
 ```html
@@ -52,7 +54,11 @@ wget https://raw.githubusercontent.com/apache/incubator-streampark/dev/deploy/do
 wget https://raw.githubusercontent.com/apache/incubator-streampark/dev/deploy/docker/mysql/.env
 vim .env
 ```
-Modify the corresponding connection information
+
+First, create the `streampark` database in MySQL, then manually execute the SQL for the corresponding data source found in the schema and data scripts.
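+
+A minimal sketch of these preparation steps, assuming MySQL runs on localhost and using illustrative script paths (use the schema and data scripts that ship with your StreamPark release):
+
+```sh
+## Create the database used by StreamPark
+mysql -u root -p -e "CREATE DATABASE IF NOT EXISTS streampark DEFAULT CHARACTER SET utf8mb4;"
+## Execute the MySQL schema and data scripts (paths are illustrative)
+mysql -u root -p streampark < /path/to/schema/mysql-schema.sql
+mysql -u root -p streampark < /path/to/data/mysql-data.sql
+```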
+
+After that, modify the corresponding connection information.
+
 ```html
 SPRING_PROFILES_ACTIVE=mysql
 
SPRING_DATASOURCE_URL=jdbc:mysql://localhost:3306/streampark?useSSL=false&useUnicode=true&characterEncoding=UTF-8&allowPublicKeyRetrieval=false&useJDBCCompliantTimezoneShift=true&useLegacyDatetimeCode=false&serverTimezone=GMT%2B8
@@ -97,3 +103,102 @@ vim docker-compose
 ```
 docker-compose up -d
 ```
+
+## Docker-Compose Configuration
+
+The docker-compose.yaml file references the configuration from the `.env` file. The modified configuration is as follows:
+
+```yaml
+version: '3.8'
+services:
+  ## streampark-console container
+  streampark-console:
+    ## streampark image
+    image: apache/streampark:latest
+    ## streampark image startup command
+    command: ${RUN_COMMAND}
+    ports:
+      - 10000:10000
+    ## Environment configuration file
+    env_file: .env
+    environment:
+      ## Declare environment variable
+      HADOOP_HOME: ${HADOOP_HOME}
+    volumes:
+      - flink:/streampark/flink/${FLINK}
+      - /var/run/docker.sock:/var/run/docker.sock
+      - /etc/hosts:/etc/hosts:ro
+      - ~/.kube:/root/.kube:ro
+    privileged: true
+    restart: unless-stopped
+    networks:
+      - streampark
+
+  ## flink-jobmanager container
+  flink-jobmanager:
+    image: ${FLINK_IMAGE}
+    ports:
+      - "8081:8081"
+    command: jobmanager
+    volumes:
+      - flink:/opt/flink
+    env_file: .env
+    restart: unless-stopped
+    privileged: true
+    networks:
+      - streampark
+
+  ## flink-taskmanager container
+  flink-taskmanager:
+    image: ${FLINK_IMAGE}
+    depends_on:
+      - flink-jobmanager
+    command: taskmanager
+    deploy:
+      replicas: 1
+    env_file: .env
+    restart: unless-stopped
+    privileged: true
+    networks:
+      - streampark
+
+networks:
+  streampark:
+    driver: bridge
+
+volumes:
+  flink:
+```
+
+Finally, execute the start command:
+
+```sh
+cd deploy/docker
+docker-compose up -d
+```
+
+You can use `docker ps` to check whether the installation was successful. If the following information is displayed, the installation succeeded:
+
+![](/doc/image/streampark_docker_ps.png)
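+
+To follow the startup in more detail, a minimal sketch (the service name comes from the docker-compose.yaml above; the URL assumes the default 10000 port mapping):
+
+```sh
+## Follow the streampark-console logs
+docker-compose logs -f streampark-console
+## Once started, the console is reachable on the mapped port, e.g. http://localhost:10000
+```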
+
+## Uploading Configuration to the Container
+
+In the previous `.env` file, `HADOOP_HOME` was declared, and its corresponding directory is `/streampark/hadoop`. Therefore, you need to upload the `etc/hadoop` directory from the Hadoop installation package to the `/streampark/hadoop` directory in the container. The commands are as follows:
+
+```sh
+## Upload the Hadoop configuration (replace the path with the etc directory of your Hadoop distribution)
+docker cp /path/to/hadoop/etc streampark-docker_streampark-console_1:/streampark/hadoop
+## Enter the container
+docker exec -it streampark-docker_streampark-console_1 bash
+## Check
+ls
+```
+
+![](/doc/image/streampark_docker_ls_hadoop.png)
+
+In addition, other configuration files, such as Maven's `settings.xml`, can be uploaded in the same manner, for example:
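+
+A minimal sketch (the in-container destination path is an assumption; adjust it to wherever your StreamPark setup reads the Maven settings file):
+
+```sh
+## Upload a custom Maven settings.xml (destination path is illustrative)
+docker cp /path/to/settings.xml streampark-docker_streampark-console_1:/root/.m2/settings.xml
+```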
\ No newline at end of file
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/user-guide/4-dockerDeployment.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/user-guide/4-dockerDeployment.md
index 00b4ecf..0962b32 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/user-guide/4-dockerDeployment.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/user-guide/4-dockerDeployment.md
@@ -19,7 +19,8 @@ sidebar_position: 4
 ## Quick StreamPark Deployment
 
 ### StreamPark deployment based on h2 and docker-compose
-This method is suitable for beginners to learn and become familiar with the features
+
+This method is suitable for beginners to learn and become familiar with the features. The configuration is reset after the container restarts; below, MySQL or PostgreSQL can be configured for persistence.
 #### Deployment
 
 ```sh
@@ -54,7 +55,10 @@ wget https://raw.githubusercontent.com/apache/incubator-streampark/dev/deploy/do
 wget https://raw.githubusercontent.com/apache/incubator-streampark/dev/deploy/docker/mysql/.env
 vim .env
 ```
-Modify the corresponding connection information
+
+First, create the streampark database in MySQL, then manually execute the SQL for the corresponding data source found in the schema and data scripts.
+
+Then modify the corresponding connection information.
 ```sh
 SPRING_PROFILES_ACTIVE=mysql
 
SPRING_DATASOURCE_URL=jdbc:mysql://localhost:3306/streampark?useSSL=false&useUnicode=true&characterEncoding=UTF-8&allowPublicKeyRetrieval=false&useJDBCCompliantTimezoneShift=true&useLegacyDatetimeCode=false&serverTimezone=GMT%2B8
@@ -103,7 +107,101 @@ cd ../..
 ```
 ![](/doc/image/streampark_build.png)
 
+## Docker-Compose Configuration
+
+The docker-compose.yaml file references the configuration from the `.env` file. The modified configuration is as follows:
+
+```yaml
+version: '3.8'
+services:
+  ## streampark-console container
+  streampark-console:
+    ## streampark image
+    image: apache/streampark:latest
+    ## streampark image startup command
+    command: ${RUN_COMMAND}
+    ports:
+      - 10000:10000
+    ## Environment configuration file
+    env_file: .env
+    environment:
+      ## Declare environment variable
+      HADOOP_HOME: ${HADOOP_HOME}
+    volumes:
+      - flink:/streampark/flink/${FLINK}
+      - /var/run/docker.sock:/var/run/docker.sock
+      - /etc/hosts:/etc/hosts:ro
+      - ~/.kube:/root/.kube:ro
+    privileged: true
+    restart: unless-stopped
+    networks:
+      - streampark
+
+  ## flink-jobmanager container
+  flink-jobmanager:
+    image: ${FLINK_IMAGE}
+    ports:
+      - "8081:8081"
+    command: jobmanager
+    volumes:
+      - flink:/opt/flink
+    env_file: .env
+    restart: unless-stopped
+    privileged: true
+    networks:
+      - streampark
+
+  ## flink-taskmanager container
+  flink-taskmanager:
+    image: ${FLINK_IMAGE}
+    depends_on:
+      - flink-jobmanager
+    command: taskmanager
+    deploy:
+      replicas: 1
+    env_file: .env
+    restart: unless-stopped
+    privileged: true
+    networks:
+      - streampark
+
+networks:
+  streampark:
+    driver: bridge
+
+volumes:
+  flink:
+```
+
+Finally, execute the start command:
+
 ```sh
 cd deploy/docker
 docker-compose up -d
 ```
+
+You can use `docker ps` to check whether the installation was successful. If the following information is displayed, the installation succeeded:
+
+![](/doc/image/streampark_docker_ps.png)
+
+## Uploading Configuration to the Container
+
+In the previous `.env` file, `HADOOP_HOME` was declared, and its corresponding directory is `/streampark/hadoop`. Therefore, you need to upload the `etc/hadoop` directory from the Hadoop installation package to the `/streampark/hadoop` directory in the container. The commands are as follows:
+
+```sh
+## Upload the Hadoop configuration (replace the path with the etc directory of your Hadoop distribution)
+docker cp /path/to/hadoop/etc streampark-docker_streampark-console_1:/streampark/hadoop
+## Enter the container
+docker exec -it streampark-docker_streampark-console_1 bash
+## Check
+ls
+```
+
+![](/doc/image/streampark_docker_ls_hadoop.png)
+
+In addition, other configuration files, such as Maven's `settings.xml`, can be uploaded in the same manner.
\ No newline at end of file
diff --git a/static/doc/image/streampark_docker_ls_hadoop.png b/static/doc/image/streampark_docker_ls_hadoop.png
new file mode 100644
index 0000000..f478959
Binary files /dev/null and b/static/doc/image/streampark_docker_ls_hadoop.png differ
diff --git a/static/doc/image/streampark_docker_ps.png b/static/doc/image/streampark_docker_ps.png
new file mode 100644
index 0000000..60776cd
Binary files /dev/null and b/static/doc/image/streampark_docker_ps.png differ
