This is an automated email from the ASF dual-hosted git repository.
xincheng pushed a commit to branch dev
in repository https://gitbox.apache.org/repos/asf/dolphinscheduler.git
The following commit(s) were added to refs/heads/dev by this push:
new e4de06b5af [Doc-15500][Task] Update cli opts of spark and flink (#15501)
e4de06b5af is described below
commit e4de06b5af1c42299a72988f356b8f611e8cf431
Author: Rick Cheng <[email protected]>
AuthorDate: Wed Jan 17 14:22:27 2024 +0800
[Doc-15500][Task] Update cli opts of spark and flink (#15501)
---
docs/docs/en/guide/task/flink.md | 2 +-
docs/docs/en/guide/task/spark.md | 2 +-
docs/docs/zh/guide/task/flink.md | 2 +-
docs/docs/zh/guide/task/spark.md | 2 +-
4 files changed, 4 insertions(+), 4 deletions(-)
diff --git a/docs/docs/en/guide/task/flink.md b/docs/docs/en/guide/task/flink.md
index b57606d4d1..b30af03efc 100644
--- a/docs/docs/en/guide/task/flink.md
+++ b/docs/docs/en/guide/task/flink.md
@@ -37,7 +37,7 @@ Flink task type, used to execute Flink programs. For Flink nodes:
| Parallelism             | Used to set the degree of parallelism for executing Flink tasks. |
| Yarn queue              | Used to set the yarn queue, use `default` queue by default. |
| Main program parameters | Set the input parameters for the Flink program and support the substitution of custom parameter variables. |
-| Optional parameters     | Support `--jar`, `--files`,` --archives`, `--conf` format. |
+| Optional parameters     | Set the Flink command options, such as `-D`, `-C`, `-yt`. |
| Custom parameter        | It is a local user-defined parameter for Flink, and will replace the content with `${variable}` in the script. |
## Task Example
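The replacement row refers to generic `flink run` flags. A hedged sketch of how they might be combined on the command line; all paths, jar names, and values below are illustrative placeholders, not taken from this commit:

```shell
# Hypothetical `flink run` invocation (paths and values are placeholders):
#   -D   sets a Flink configuration property for this job
#   -C   adds a URL to the user classpath on all nodes
#   -yt  ships a local directory to the YARN cluster (per-job mode)
flink run \
  -D taskmanager.memory.process.size=4g \
  -C file:///opt/libs/udf.jar \
  -yt /opt/flink/plugins \
  /opt/jobs/wordcount.jar
```

Note that `-yt` (`--yarnship`) only applies when the job is submitted to YARN.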
diff --git a/docs/docs/en/guide/task/spark.md b/docs/docs/en/guide/task/spark.md
index 88e6c61943..3e0f83b253 100644
--- a/docs/docs/en/guide/task/spark.md
+++ b/docs/docs/en/guide/task/spark.md
@@ -35,7 +35,7 @@ Spark task type for executing Spark application. When executing the Spark task,
| Executor memory size    | Set the size of Executor memories, which can be set according to the actual production environment. |
| Yarn queue              | Set the yarn queue, use `default` queue by default. |
| Main program parameters | Set the input parameters of the Spark program and support the substitution of custom parameter variables. |
-| Optional parameters     | Support `--jars`, `--files`,` --archives`, `--conf` format. |
+| Optional parameters     | Set the Spark command options, such as `--jars`, `--files`, `--archives`, `--conf`. |
| Resource                | Appoint resource files in the `Resource` if parameters refer to them. |
| Custom parameter        | It is a local user-defined parameter for Spark, and will replace the content with `${variable}` in the script. |
| Predecessor task        | Selecting a predecessor task for the current task will set the selected predecessor task as upstream of the current task. |
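The options listed in the replacement row are standard `spark-submit` flags. A hedged sketch of a submission using them; all jar names, paths, and values below are illustrative placeholders, not taken from this commit:

```shell
# Hypothetical spark-submit invocation (names and paths are placeholders):
#   --jars      comma-separated jars added to driver and executor classpaths
#   --files     files placed in the working directory of each executor
#   --archives  archives extracted into the working directory of each executor
#   --conf      an arbitrary Spark configuration property as key=value
spark-submit \
  --jars /opt/libs/mysql-connector.jar \
  --files /etc/app/log4j.properties \
  --archives /opt/archives/deps.zip \
  --conf spark.executor.memoryOverhead=1g \
  --class com.example.Main \
  /opt/jobs/app.jar
```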
diff --git a/docs/docs/zh/guide/task/flink.md b/docs/docs/zh/guide/task/flink.md
index b15078e191..478e07fae8 100644
--- a/docs/docs/zh/guide/task/flink.md
+++ b/docs/docs/zh/guide/task/flink.md
@@ -37,7 +37,7 @@ Flink 任务类型,用于执行 Flink 程序。对于 Flink 节点:
| 并行度 | 用于设置执行 Flink 任务的并行度 |
| Yarn 队列 | 用于设置 Yarn 队列,默认使用 default 队列 |
| 主程序参数 | 设置 Flink 程序的输入参数,支持自定义参数变量的替换 |
-| 选项参数 | 支持 `--jar`、`--files`、`--archives`、`--conf` 格式 |
+| 选项参数 | 设置 Flink 命令的选项参数,例如 `-D`、`-C`、`-yt` |
| 自定义参数 | 是 Flink 局部的用户自定义参数,会替换脚本中以 ${变量} 的内容 |
## 任务样例
diff --git a/docs/docs/zh/guide/task/spark.md b/docs/docs/zh/guide/task/spark.md
index 641b6dcbf9..a392f55826 100644
--- a/docs/docs/zh/guide/task/spark.md
+++ b/docs/docs/zh/guide/task/spark.md
@@ -34,7 +34,7 @@ Spark 任务类型用于执行 Spark 应用。对于 Spark 节点,worker 支
- Executor 内存数:用于设置 Executor 内存数,可根据实际生产环境设置对应的内存数。
- Yarn 队列:用于设置 Yarn 队列,默认使用 default 队列。
- 主程序参数:设置 Spark 程序的输入参数,支持自定义参数变量的替换。
-- 选项参数:支持 `--jars`、`--files`、`--archives`、`--conf` 格式。
+- 选项参数:设置 Spark 命令的选项参数,例如 `--jars`、`--files`、`--archives`、`--conf`。
- 资源:如果其他参数中引用了资源文件,需要在资源中选择指定。
- 自定义参数:是 Spark 局部的用户自定义参数,会替换脚本中以 ${变量} 的内容。