This is an automated email from the ASF dual-hosted git repository.
peacewong pushed a commit to branch dev
in repository https://gitbox.apache.org/repos/asf/linkis-website.git
The following commit(s) were added to refs/heads/dev by this push:
new 0b4ee1725c3 Modify the repl and spark ec documents (#773)
0b4ee1725c3 is described below
commit 0b4ee1725c3d3d8e37b381fd2a11bf63da5e42b9
Author: ChengJie1053 <[email protected]>
AuthorDate: Sat Dec 16 22:47:20 2023 +0800
Modify the repl and spark ec documents (#773)
* Modify the repl and spark ec documents
* Modify spark.md
---
docs/engine-usage/repl.md | 17 +++++++++++++++++
docs/engine-usage/spark.md | 17 +++++++++++++++++
.../current/engine-usage/repl.md | 17 +++++++++++++++++
.../current/engine-usage/spark.md | 17 +++++++++++++++++
4 files changed, 68 insertions(+)
diff --git a/docs/engine-usage/repl.md b/docs/engine-usage/repl.md
index a2f54a5347b..7a15ceab258 100644
--- a/docs/engine-usage/repl.md
+++ b/docs/engine-usage/repl.md
@@ -75,6 +75,7 @@ select * from linkis_cg_engine_conn_plugin_bml_resources;
### 3.1 Submit `java` tasks through `Linkis-cli`
+Single method
```shell
sh bin/linkis-cli -engineType repl-1 -code \
"import org.apache.commons.lang3.StringUtils;
@@ -85,6 +86,22 @@ select * from linkis_cg_engine_conn_plugin_bml_resources;
-codeType repl -runtimeMap linkis.repl.type=java
```
+Multiple methods
+```shell
+ sh bin/linkis-cli -engineType repl-1 -code \
+"import org.apache.commons.lang3.StringUtils;
+
+ public void sayHello() {
+ System.out.println(\"hello\");
+ System.out.println(StringUtils.isEmpty(\"hello\"));
+ }
+ public void sayHi() {
+ System.out.println(\"hi\");
+ System.out.println(StringUtils.isEmpty(\"hi\"));
+ }" \
+ -codeType repl -runtimeMap linkis.repl.type=java -runtimeMap linkis.repl.method.name=sayHi
+```
+
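As a quick variation on the multi-method example above (a sketch only; the code and flags are exactly those shown in the diff), `linkis.repl.method.name` selects which of the submitted methods is executed, so pointing it at `sayHello` runs that method instead:
```shell
# Same snippet as above, but execute sayHello instead of sayHi
sh bin/linkis-cli -engineType repl-1 -code \
"import org.apache.commons.lang3.StringUtils;

  public void sayHello() {
    System.out.println(\"hello\");
    System.out.println(StringUtils.isEmpty(\"hello\"));
  }
  public void sayHi() {
    System.out.println(\"hi\");
    System.out.println(StringUtils.isEmpty(\"hi\"));
  }" \
-codeType repl -runtimeMap linkis.repl.type=java -runtimeMap linkis.repl.method.name=sayHello
```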
### 3.2 Submit `scala` tasks through `Linkis-cli`
```shell
diff --git a/docs/engine-usage/spark.md b/docs/engine-usage/spark.md
index dab25c22f10..be9f208f2ae 100644
--- a/docs/engine-usage/spark.md
+++ b/docs/engine-usage/spark.md
@@ -180,6 +180,23 @@ Token-User: linkis
### 3.5 Submitting spark yarn cluster tasks via `Linkis-cli`
+Upload the jar packages and configuration files
+```shell
+# Upload the jar packages under the lib directory of the linkis spark engine (adjust the following paths to your actual installation directory)
+cd /appcom/Install/linkis/lib/linkis-engineconn-plugins/spark/dist/3.2.1/lib
+hdfs dfs -put *.jar hdfs:///spark/cluster
+
+# Upload the linkis configuration files (adjust the following path to your actual installation directory)
+cd /appcom/Install/linkis/conf
+hdfs dfs -put * hdfs:///spark/cluster
+
+# Upload hive-site.xml (adjust the following path to your actual installation directory)
+cd $HIVE_CONF_DIR
+hdfs dfs -put hive-site.xml hdfs:///spark/cluster
+```
+The default `hdfs:///spark/cluster` path can be changed via the `linkis.spark.yarn.cluster.jars` parameter (see the sketch below).
+
+Execute the test case
```shell
# Use `engingeConnRuntimeMode=yarnCluster` to specify the yarn cluster mode
sh ./bin/linkis-cli -engineType spark-3.2.1 -codeType sql -labelMap engingeConnRuntimeMode=yarnCluster -submitUser hadoop -proxyUser hadoop -code "select 123"
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/repl.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/repl.md
index ff78fd35a39..4e65f5cc363 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/repl.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/repl.md
@@ -73,6 +73,7 @@ select * from linkis_cg_engine_conn_plugin_bml_resources;
### 3.1 Submit `java` tasks through `Linkis-cli`
+Single method
```shell
sh bin/linkis-cli -engineType repl-1 -code \
"import org.apache.commons.lang3.StringUtils;
@@ -83,6 +84,22 @@ select * from linkis_cg_engine_conn_plugin_bml_resources;
-codeType repl -runtimeMap linkis.repl.type=java
```
+Multiple methods
+```shell
+ sh bin/linkis-cli -engineType repl-1 -code \
+"import org.apache.commons.lang3.StringUtils;
+
+ public void sayHello() {
+ System.out.println(\"hello\");
+ System.out.println(StringUtils.isEmpty(\"hello\"));
+ }
+ public void sayHi() {
+ System.out.println(\"hi\");
+ System.out.println(StringUtils.isEmpty(\"hi\"));
+ }" \
+ -codeType repl -runtimeMap linkis.repl.type=java -runtimeMap linkis.repl.method.name=sayHi
+```
+
### 3.2 Submit `scala` tasks through `Linkis-cli`
```shell
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/spark.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/spark.md
index 8374d05b6e5..09c31460f13 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/spark.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/spark.md
@@ -178,6 +178,23 @@ Token-User: linkis
### 3.5 Submitting spark yarn cluster tasks via `Linkis-cli`
+Upload the jar packages and configuration files
+```shell
+# Upload the jar packages under the lib directory of the linkis spark engine (adjust the following paths to your actual installation directory)
+cd /appcom/Install/linkis/lib/linkis-engineconn-plugins/spark/dist/3.2.1/lib
+hdfs dfs -put *.jar hdfs:///spark/cluster
+
+# Upload the linkis configuration files (adjust the following path to your actual installation directory)
+cd /appcom/Install/linkis/conf
+hdfs dfs -put * hdfs:///spark/cluster
+
+# Upload hive-site.xml (adjust the following path to your actual installation directory)
+cd $HIVE_CONF_DIR
+hdfs dfs -put hive-site.xml hdfs:///spark/cluster
+```
+The default `hdfs:///spark/cluster` path can be changed via the `linkis.spark.yarn.cluster.jars` parameter.
+
+Execute the test case
```shell
# Use `engingeConnRuntimeMode=yarnCluster` to specify yarn cluster mode
sh ./bin/linkis-cli -engineType spark-3.2.1 -codeType sql -labelMap engingeConnRuntimeMode=yarnCluster -submitUser hadoop -proxyUser hadoop -code "select 123"
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]