This is an automated email from the ASF dual-hosted git repository.
casion pushed a commit to branch dev
in repository https://gitbox.apache.org/repos/asf/linkis-website.git
The following commit(s) were added to refs/heads/dev by this push:
new e0167a247f5 update deploy-quick.md (#755)
e0167a247f5 is described below
commit e0167a247f5d28b108564db8d2691d4235aa3ce5
Author: 赵文恺 <[email protected]>
AuthorDate: Wed Oct 25 10:01:26 2023 +0800
update deploy-quick.md (#755)
* Update deploy-quick.md: add notes on checking default and non-default engines
* Update deploy-quick.md
* Update deploy-quick.md
* Update deploy-quick.md
* Update deploy-quick.md
* update deploy-quick.md in the following chapters (Chinese & English):
1.3 checking list for engines
7.3 additional engines check
* update deploy-quick.md in the following chapters (Chinese & English):
1.3 modify dependencies
7.3 add parameter definitions for engines
* update deploy-quick.md for additional engines:
1.3 modify dependencies
7.3 add parameter definitions for engines
---------
Co-authored-by: peter.peng <[email protected]>
---
docs/deployment/deploy-quick.md | 56 ++++++++++++++++------
.../current/deployment/deploy-quick.md | 45 +++++++++++++++--
2 files changed, 82 insertions(+), 19 deletions(-)
diff --git a/docs/deployment/deploy-quick.md b/docs/deployment/deploy-quick.md
index c755e1954ea..f273649172e 100644
--- a/docs/deployment/deploy-quick.md
+++ b/docs/deployment/deploy-quick.md
@@ -36,6 +36,19 @@ hadoop ALL=(ALL) NOPASSWD: ALL
<font color='red'>The following operations are performed under the hadoop user</font>
+### 1.3 Installation dependencies
+
+Linkis depends on the engines listed below. All mandatory engines are checked by the installation script `${LINKIS_HOME}/bin/checkEnv.sh`.
+
+| EngineType     | Necessary | Installation Guide                                                                                                 |
+|----------------|-----------|--------------------------------------------------------------------------------------------------------------------|
+| JDK(1.8.0_141) | mandatory | [Install JDK and set JAVA_HOME](https://docs.oracle.com/cd/E19509-01/820-5483/6ngsiu065/index.html)                  |
+| MySQL(5.5+)    | mandatory | [MySQL installation](https://docs.oracle.com/cd/E69403_01/html/E56873/mysql.html)                                    |
+| Python(3.6.8)  | mandatory | [Python installation and user guide](https://docs.python.org/zh-cn/3/using/index.html)                               |
+| Nginx(1.14.1)  | mandatory | [Nginx installation](http://nginx.org/en/linux_packages.html#instructions)                                           |
+| Hadoop(2.7.2)  | mandatory | [Hadoop quickstart](https://hadoop.apache.org/docs/r1.0.4/cn/quickstart.html#%E5%AE%89%E8%A3%85%E8%BD%AF%E4%BB%B6)   |
+| Spark(2.4.3)   | mandatory | [Spark download and installation](https://spark.apache.org/downloads.html)                                           |
+| Hive(3.1.3)    | mandatory | [Hive installation](https://cwiki.apache.org/confluence/display/hive/adminmanual+installation)                       |
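+
+For illustration, the mandatory-engine check can also be run on its own before installation; a minimal sketch, assuming `${LINKIS_HOME}` points at the unpacked installation package:
+
+```shell script
+# Verify the mandatory dependencies listed above before running the install script;
+# the script reports any engine that is missing or misconfigured.
+sh ${LINKIS_HOME}/bin/checkEnv.sh
+```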
## 2. Configuration modification
@@ -586,24 +599,37 @@ linkis-package/lib/linkis-engineconn-plugins/
└── plugin
└── 3.2.1
```
-
#### Method 2: View the database table of linkis
```shell script
select * from linkis_cg_engine_conn_plugin_bml_resources
-```
-
-
-## 8. Troubleshooting guidelines for common abnormal problems
-### 8.1. Yarn queue check
-
->If you need to use the spark/hive/flink engine
-
-After logging in, check whether the yarn queue resources can be displayed normally (click the button in the lower right corner of the page) (you need to install the front end first).
-
-Normal as shown in the figure below:
-
-
-If it cannot be displayed: You can adjust it according to the following guidelines
+```
+### 7.3 Additional engines check
+The additional engines are checked manually by running `sh $LINKIS_HOME/bin/checkAdd.sh ${engineType}`; see the script at `$LINKIS_HOME/bin/checkAdd.sh`. The checking method is as follows:
+```shell script
+function print_usage(){
+ echo "Usage: checkAdd [EngineName]"
+ echo " EngineName : The Engine name that you want to check"
+ echo " Engine list as bellow: JDBC Flink openLooKeng Pipeline Presto Sqoop
Elasticsearch "
+}
+
+```
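+
+For example, to check a single engine (a usage sketch; the engine name must come from the list above):
+
+```shell script
+# Run the manual check for the JDBC engine, using connection info from deploy-config
+sh $LINKIS_HOME/bin/checkAdd.sh JDBC
+```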
+The parameters used in the additional engines check fall into two categories: one is the data engine connection information, defined in `$LINKIS_HOME/deploy-config/db.sh`; the other is reference parameters, including check switches, version definitions, Java paths, etc., defined in `$LINKIS_HOME/deploy-config/linkis-env.sh`. The engines and parameter descriptions are as follows:
+
+| EngineType    | Parameters | Parameter Description |
+|---------------|------------|-----------------------|
+| JDBC          | ${MYSQL_HOST}, ${MYSQL_PORT}, ${MYSQL_DB}, ${MYSQL_USER}, ${MYSQL_PASSWORD} | MySQL engine connection information, including host IP, port, database name, username, password |
+| JDBC          | ${MYSQL_CONNECT_JAVA_PATH} | MySQL JDBC driver directory |
+| Flink         | ${FLINK_HOME} | The installation directory of Flink, including Flink execution scripts and samples |
+| openLooKeng   | ${OLK_HOST}, ${OLK_PORT}, ${OLK_CATALOG}, ${OLK_SCHEMA}, ${OLK_USER}, ${OLK_PASSWORD} | openLooKeng engine connection information, including host IP, port, catalog, schema, username, password |
+| openLooKeng   | ${OLK_JDBC_PATH} | openLooKeng connector directory |
+| Presto        | ${PRESTO_HOST}, ${PRESTO_PORT}, ${PRESTO_CATALOG}, ${PRESTO_SCHEMA} | Presto engine connection information, including host IP, port, catalog, schema |
+| Sqoop         | ${HIVE_META_URL}, ${HIVE_META_USER}, ${HIVE_META_PASSWORD} | Sqoop connection information for connecting to Hive, including service address, username, password |
+| Elasticsearch | ${ES_RESTFUL_URL} | Elasticsearch RESTful API URL |
+| Impala        | ${IMPALA_HOST}, ${IMPALA_PORT} | Impala connection information, including host IP and port |
+| Trino         | ${TRINO_COORDINATOR_HOST}, ${TRINO_COORDINATOR_PORT}, ${TRINO_COORDINATOR_CATALOG}, ${TRINO_COORDINATOR_SCHEMA} | Trino connection information, including host IP, port, catalog, and schema |
+| Seatunnel     | ${SEATUNNEL_HOST}, ${SEATUNNEL_PORT} | Seatunnel connection information, including host IP and port |
+
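+As an illustration, the JDBC check reads the MySQL connection variables above from `$LINKIS_HOME/deploy-config/db.sh`; a minimal sketch with placeholder values (every value below is an assumption to be replaced with real connection info):
+
+```shell script
+# db.sh sketch for the JDBC (MySQL) check -- placeholder values only
+MYSQL_HOST=127.0.0.1
+MYSQL_PORT=3306
+MYSQL_DB=linkis_db
+MYSQL_USER=hadoop
+MYSQL_PASSWORD=your_password
+# Directory holding the MySQL JDBC driver jar (path is an assumption)
+MYSQL_CONNECT_JAVA_PATH=/usr/local/lib/mysql-connector-java
+```
+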
+## 8. Troubleshooting Guidelines for Common Abnormal Problems
+### 8.1. Yarn Queue Check
#### 8.1.1 Check whether the yarn address is configured correctly
Database table `linkis_cg_rm_external_resource_provider`
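
To inspect the configured yarn address, the provider table can be queried directly (a sketch; the table name comes from this guide):

```shell script
select * from linkis_cg_rm_external_resource_provider;
```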
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/deploy-quick.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/deploy-quick.md
index 56541e69dd2..608f7db07af 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/deploy-quick.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/deploy-quick.md
@@ -12,9 +12,9 @@ sidebar_position: 1
### 1.2 Add deployment user
-
->Deployment user: the user who starts the core Linkis processes. This user also acts as the administrator by default. <font color="red">The corresponding administrator login password is generated during deployment and is located in the `conf/linkis-mg-gateway.properties` file.</font>
-Linkis supports specifying the submitting and executing users. The main Linkis process services switch to the corresponding user via `sudo -u ${linkis-user}` and then execute the corresponding engine startup command, so the user who owns the `linkis-engine` process is the executor of the task (the deployment user therefore needs sudo privileges, and passwordless ones at that).
+
+> Deployment user: the user who starts the core Linkis processes. This user also acts as the administrator by default. <font color="red">The corresponding administrator login password is generated during deployment and is located in the `conf/linkis-mg-gateway.properties` file.</font>
+> Linkis supports specifying the submitting and executing users. The main Linkis process services switch to the corresponding user via `sudo -u ${linkis-user}` and then execute the corresponding engine startup command, so the user who owns the `linkis-engine` process is the executor of the task (the deployment user therefore needs sudo privileges, and passwordless ones at that).
Take the hadoop user as an example (<font color="red">many configuration users in Linkis default to the hadoop user; first-time installers are advised to use the hadoop user, otherwise many unexpected errors may be encountered during installation</font>):
@@ -36,7 +36,19 @@ hadoop ALL=(ALL) NOPASSWD: ALL
<font color='red'>The following operations are performed under the hadoop user</font>
-
+### 1.3 Installation dependencies
+
+Linkis depends on the engines listed below; these mandatory engines are checked by the installation script `${LINKIS_HOME}/bin/checkEnv.sh`.
+
+| EngineType     | Necessary | Installation Guide                                                                                                 |
+|----------------|-----------|--------------------------------------------------------------------------------------------------------------------|
+| JDK(1.8.0_141) | mandatory | [Install JDK and set JAVA_HOME](https://docs.oracle.com/cd/E19509-01/820-5483/6ngsiu065/index.html)                  |
+| MySQL(5.5+)    | mandatory | [MySQL installation](https://docs.oracle.com/cd/E69403_01/html/E56873/mysql.html)                                    |
+| Python(3.6.8)  | mandatory | [Python installation and user guide](https://docs.python.org/zh-cn/3/using/index.html)                               |
+| Nginx(1.14.1)  | mandatory | [Nginx installation](http://nginx.org/en/linux_packages.html#instructions)                                           |
+| Hadoop(2.7.2)  | mandatory | [Hadoop quickstart](https://hadoop.apache.org/docs/r1.0.4/cn/quickstart.html#%E5%AE%89%E8%A3%85%E8%BD%AF%E4%BB%B6)   |
+| Spark(2.4.3)   | mandatory | [Spark download and installation](https://spark.apache.org/downloads.html)                                           |
+| Hive(3.1.3)    | mandatory | [Hive installation](https://cwiki.apache.org/confluence/display/hive/adminmanual+installation)                       |
## 2. Configuration modification
@@ -591,7 +603,32 @@ linkis-package/lib/linkis-engineconn-plugins/
```shell script
select * from linkis_cg_engine_conn_plugin_bml_resources
```
+### 7.3 Additional engines check
+The additional (non-default) engines are checked manually by running `sh $LINKIS_HOME/bin/checkAdd.sh ${engineType}`; see the script at `$LINKIS_HOME/bin/checkAdd.sh`. The checking method is as follows:
+```shell script
+function print_usage(){
+ echo "Usage: checkAdd [EngineName]"
+ echo " EngineName : The Engine name that you want to check"
+ echo " Engine list as bellow: JDBC Flink openLooKeng Presto Sqoop
Elasticsearch "
+}
+```
+
+The parameters used in the additional engines check fall into two categories: one is the data engine connection information, defined in `$LINKIS_HOME/deploy-config/db.sh`; the other is reference parameters, including check switches, version definitions, Java paths, etc., defined in `$LINKIS_HOME/deploy-config/linkis-env.sh`. The relevant engines and parameter descriptions are as follows:
+
+| EngineType    | Parameters | Parameter Description |
+|---------------|------------|-----------------------|
+| JDBC          | ${MYSQL_HOST}, ${MYSQL_PORT}, ${MYSQL_DB}, ${MYSQL_USER}, ${MYSQL_PASSWORD} | MySQL engine connection information, including host IP, port, database name, username, password |
+| JDBC          | ${MYSQL_CONNECT_JAVA_PATH} | MySQL JDBC driver directory |
+| Flink         | ${FLINK_HOME} | The Flink installation directory, including Flink execution scripts and samples |
+| openLooKeng   | ${OLK_HOST}, ${OLK_PORT}, ${OLK_CATALOG}, ${OLK_SCHEMA}, ${OLK_USER}, ${OLK_PASSWORD} | openLooKeng engine connection information, including host IP, port, catalog, schema, username, password |
+| openLooKeng   | ${OLK_JDBC_PATH} | openLooKeng connector directory |
+| Presto        | ${PRESTO_HOST}, ${PRESTO_PORT}, ${PRESTO_CATALOG}, ${PRESTO_SCHEMA} | Presto engine connection information, including host IP, port, catalog, schema |
+| Sqoop         | ${HIVE_META_URL}, ${HIVE_META_USER}, ${HIVE_META_PASSWORD} | Sqoop connection information for connecting to Hive, including service address, username, password |
+| Elasticsearch | ${ES_RESTFUL_URL} | Elasticsearch RESTful API URL |
+| Impala        | ${IMPALA_HOST}, ${IMPALA_PORT} | Impala connection information, including host IP and port |
+| Trino         | ${TRINO_COORDINATOR_HOST}, ${TRINO_COORDINATOR_PORT}, ${TRINO_COORDINATOR_CATALOG}, ${TRINO_COORDINATOR_SCHEMA} | Trino connection information, including host IP, port, catalog, and schema |
+| Seatunnel     | ${SEATUNNEL_HOST}, ${SEATUNNEL_PORT} | Seatunnel connection information, including host IP and port |
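+
+As an illustration, the Presto check reads its connection variables from `$LINKIS_HOME/deploy-config/db.sh`; a minimal sketch with placeholder values (host, port, catalog, and schema below are assumptions):
+
+```shell script
+# db.sh sketch for the Presto check -- placeholder values only
+PRESTO_HOST=127.0.0.1
+PRESTO_PORT=8080
+PRESTO_CATALOG=hive
+PRESTO_SCHEMA=default
+```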
## 8. Troubleshooting guidelines for common abnormal problems
### 8.1 Yarn queue check