This is an automated email from the ASF dual-hosted git repository.

zhongjiajie pushed a commit to branch dev
in repository https://gitbox.apache.org/repos/asf/dolphinscheduler.git


The following commit(s) were added to refs/heads/dev by this push:
     new 6cf9d088ff [doc] Change tasks docs (#10751)
6cf9d088ff is described below

commit 6cf9d088ff7fd6e9aeeb3103b0827ba3d287679b
Author: sneh-wha <[email protected]>
AuthorDate: Mon Jul 4 08:59:59 2022 +0530

    [doc] Change tasks docs (#10751)
---
 docs/docs/en/guide/task/conditions.md |  2 +-
 docs/docs/en/guide/task/emr.md        |  4 ++--
 docs/docs/en/guide/task/flink.md      | 28 ----------------------------
 docs/docs/en/guide/task/http.md       |  2 +-
 docs/docs/en/guide/task/jupyter.md    |  4 ++--
 docs/docs/en/guide/task/mlflow.md     |  7 ++++++-
 6 files changed, 12 insertions(+), 35 deletions(-)

diff --git a/docs/docs/en/guide/task/conditions.md b/docs/docs/en/guide/task/conditions.md
index 9ffd8e3db3..75576cdd46 100644
--- a/docs/docs/en/guide/task/conditions.md
+++ b/docs/docs/en/guide/task/conditions.md
@@ -4,7 +4,7 @@ Condition is a conditional node, that determines which downstream task should ru
 
 ## Create Task
 
-- Click `Project Management -> Project Name -> Workflow Definition`, and click the "`Create Workflow`" button to enter the DAG editing page.
+- Click `Project Management -> Project Name -> Workflow Definition`, and click the `Create Workflow` button to enter the DAG editing page.
 - Drag from the toolbar <img src="../../../../img/conditions.png" width="20"/> task node to canvas.
 
 ## Task Parameters
diff --git a/docs/docs/en/guide/task/emr.md b/docs/docs/en/guide/task/emr.md
index 9d8021ad4a..050d7c2397 100644
--- a/docs/docs/en/guide/task/emr.md
+++ b/docs/docs/en/guide/task/emr.md
@@ -6,8 +6,8 @@ Amazon EMR task type, for creating EMR clusters on AWS and running computing tas
 
 ## Create Task
 
-*   Click `Project Management -> Project Name -> Workflow Definition`, click the "`Create Workflow`" button to enter the DAG editing page.
-*   Drag `AmazonEMR` task from the toolbar to the artboard to complete the creation.
+* Click `Project Management -> Project Name -> Workflow Definition`, click the `Create Workflow` button to enter the DAG editing page.
+* Drag `AmazonEMR` task from the toolbar to the artboard to complete the creation.
 
 ## Task Parameters
 
diff --git a/docs/docs/en/guide/task/flink.md b/docs/docs/en/guide/task/flink.md
index 44c9461803..b32080ac67 100644
--- a/docs/docs/en/guide/task/flink.md
+++ b/docs/docs/en/guide/task/flink.md
@@ -45,34 +45,6 @@ Flink task type, used to execute Flink programs. For Flink nodes:
 | Resource | Appoint resource files in the `Resource` if parameters refer to them. |
 | Custom parameter | It is a local user-defined parameter for Flink, and will replace the content with `${variable}` in the script. |
 | Predecessor task | Selecting a predecessor task for the current task, will set the selected predecessor task as upstream of the current task. |
-- **Node name**: The node name in a workflow definition is unique.
-- **Run flag**: Identifies whether this node schedules normally, if it does not need to execute, select the `prohibition execution`.
-- **Descriptive information**: Describe the function of the node.
-- **Task priority**: When the number of worker threads is insufficient, execute in the order of priority from high to low, and tasks with the same priority will execute in a first-in first-out order.
-- **Worker grouping**: Assign tasks to the machines of the worker group to execute. If `Default` is selected, randomly select a worker machine for execution.
-- **Environment Name**: Configure the environment name in which run the script.
-- **Times of failed retry attempts**: The number of times the task failed to resubmit.
-- **Failed retry interval**: The time interval (unit minute) for resubmitting the task after a failed task.
-- **Delayed execution time**: The time (unit minute) that a task delays in execution.
-- **Timeout alarm**: Check the timeout alarm and timeout failure. When the task runs exceed the "timeout", an alarm email will send and the task execution will fail.
-- **Program type**: Support Java, Scala, Python and SQL four languages.
-- **The class of main function**: The **full path** of Main Class, the entry point of the Flink program.
-- **Main jar package**: The jar package of the Flink program (upload by Resource Center).
-- **Deployment mode**: Support 3 deployment modes: cluster, local and application (Flink 1.11 and later. See also [Run an application in Application Mode](https://nightlies.apache.org/flink/flink-docs-release-1.11/ops/deployment/yarn_setup.html#run-an-application-in-application-mode)).
-- **Initialization script**: Script file to initialize session context.
-- **Script**: The sql script file developed by the user that should be executed.
-- **Flink version**: Select version according to the execution env.
-- **Task name** (optional): Flink task name.
-- **JobManager memory size**: Used to set the size of jobManager memories, which can be set according to the actual production environment.
-- **Number of slots**: Used to set the number of slots, which can be set according to the actual production environment.
-- **TaskManager memory size**: Used to set the size of taskManager memories, which can be set according to the actual production environment.
-- **Number of TaskManager**: Used to set the number of taskManagers, which can be set according to the actual production environment.
-- **Parallelism**: Used to set the degree of parallelism for executing Flink tasks.
-- **Main program parameters**: Set the input parameters for the Flink program and support the substitution of custom parameter variables.
-- **Optional parameters**: Support `--jar`, `--files`,` --archives`, `--conf` format.
-- **Resource**: Appoint resource files in the `Resource` if parameters refer to them.
-- **Custom parameter**: It is a local user-defined parameter for Flink, and will replace the content with `${variable}` in the script.
-- **Predecessor task**: Selecting a predecessor task for the current task, will set the selected predecessor task as upstream of the current task.
 
 ## Task Example
 
diff --git a/docs/docs/en/guide/task/http.md b/docs/docs/en/guide/task/http.md
index 9aa10500b1..75509afb6b 100644
--- a/docs/docs/en/guide/task/http.md
+++ b/docs/docs/en/guide/task/http.md
@@ -6,7 +6,7 @@ This node is used to perform http type tasks such as the common POST and GET req
 
 ## Create Task
 
--  Click `Project Management -> Project Name -> Workflow Definition`, and click the "`Create Workflow`" button to enter the DAG editing page.
+-  Click `Project Management -> Project Name -> Workflow Definition`, and click the `Create Workflow` button to enter the DAG editing page.
- Drag the <img src="../../../../img/tasks/icons/http.png" width="15"/> from the toolbar to the drawing board.
 
 ## Task Parameters
diff --git a/docs/docs/en/guide/task/jupyter.md b/docs/docs/en/guide/task/jupyter.md
index b69aad6374..648c42a651 100644
--- a/docs/docs/en/guide/task/jupyter.md
+++ b/docs/docs/en/guide/task/jupyter.md
@@ -42,13 +42,13 @@ Click [here](https://docs.conda.io/en/latest/) for more information about `conda
 └── ssl
 ```   
 
-> NOTE: Please follow the `conda pack` instructions above strictly, and DO NOT modify `bin/activate`.
+> NOTICE: Please follow the `conda pack` instructions above strictly, and DO NOT modify `bin/activate`.
 > `Jupyter Task Plugin` uses `source` command to activate your packed conda 
 > environment.
 > If you are concerned about using `source`, choose other options to manage 
 > your python dependency.   
 
 ## Create Task
 
-- Click `Project Management-Project Name-Workflow Definition`, and click the "`Create Workflow`" button to enter the DAG editing page.
+- Click `Project Management-Project Name-Workflow Definition`, and click the `Create Workflow` button to enter the DAG editing page.
- Drag <img src="../../../../img/tasks/icons/jupyter.png" width="15"/> from the toolbar to the canvas.
 
 ## Task Parameters
diff --git a/docs/docs/en/guide/task/mlflow.md b/docs/docs/en/guide/task/mlflow.md
index 5c61660481..e4af9a15e3 100644
--- a/docs/docs/en/guide/task/mlflow.md
+++ b/docs/docs/en/guide/task/mlflow.md
@@ -117,9 +117,14 @@ You can now use this feature to run all MLFlow projects on Github (For example [
 
 
![mlflow-models-docker-compose](../../../../img/tasks/demo/mlflow-models-docker-compose.png)
 
+| **Parameter** | **Description** |
+| ------- | ---------- |
+| Max Cpu Limit | For example, `1.0` or `0.5`, the same as docker compose. |
+| Max Memory Limit | For example `1G` or `500M`, the same as docker compose. |
+
 ## Environment to Prepare
 
-### Conda environment
+### Conda Environment
 
You need to enter the admin account to configure a conda environment variable(Please
 install [anaconda](https://docs.continuum.io/anaconda/install/)
