This is an automated email from the ASF dual-hosted git repository.
casion pushed a commit to branch dev
in repository https://gitbox.apache.org/repos/asf/incubator-linkis-website.git
The following commit(s) were added to refs/heads/dev by this push:
new 96dd489ea Fix some markdown link errors (#348)
96dd489ea is described below
commit 96dd489ea422c49c5abb2154c2b828fad1e01a0e
Author: Beacontownfc <[email protected]>
AuthorDate: Sat Jun 18 11:32:57 2022 +0800
Fix some markdown link errors (#348)
* Fix some markdown link errors
---
docs/api/linkis_task_operator.md | 2 +-
docs/api/login_api.md | 2 +-
docs/architecture/computation_governance_services/overview.md | 2 +-
.../job_submission_preparation_and_execution_process.md | 2 +-
docs/development/linkis_compile_and_package.md | 2 +-
docs/engine_usage/flink.md | 4 ++--
docs/engine_usage/hive.md | 4 ++--
docs/engine_usage/openlookeng.md | 2 +-
docs/engine_usage/pipeline.md | 2 +-
docs/engine_usage/spark.md | 4 ++--
docs/introduction.md | 2 +-
.../docusaurus-plugin-content-blog/2022-02-08-how-to-user-blog.md | 2 +-
.../2022-02-21-linkis-deploy/index.md | 4 ++--
.../current/download-logo.md | 6 +++---
i18n/zh-CN/docusaurus-plugin-content-docs-faq/current/main.md | 2 +-
.../current/api/linkis_task_operator.md | 2 +-
i18n/zh-CN/docusaurus-plugin-content-docs/current/api/login_api.md | 2 +-
.../architecture/computation_governance_services/overview.md | 2 +-
.../job_submission_preparation_and_execution_process.md | 2 +-
.../current/deployment/linkis_scriptis_install.md | 2 +-
.../current/deployment/quick_deploy.md | 4 ++--
.../current/development/linkis_compile_and_package.md | 2 +-
.../docusaurus-plugin-content-docs/current/engine_usage/flink.md | 2 +-
.../docusaurus-plugin-content-docs/current/engine_usage/hive.md | 4 ++--
.../docusaurus-plugin-content-docs/current/engine_usage/jdbc.md | 2 +-
.../current/engine_usage/openlookeng.md | 2 +-
.../docusaurus-plugin-content-docs/current/engine_usage/pipeline.md | 2 +-
.../docusaurus-plugin-content-docs/current/engine_usage/spark.md | 4 ++--
.../version-1.0.2/api/linkis_task_operator.md | 2 +-
.../docusaurus-plugin-content-docs/version-1.0.2/api/login_api.md | 2 +-
.../architecture/computation_governance_services/overview.md | 2 +-
.../job_submission_preparation_and_execution_process.md | 2 +-
.../version-1.0.2/deployment/quick_deploy.md | 2 +-
.../version-1.0.2/development/linkis_compile_and_package.md | 4 ++--
.../version-1.0.2/engine_usage/hive.md | 4 ++--
.../version-1.0.2/engine_usage/spark.md | 4 ++--
.../version-1.0.3/api/linkis_task_operator.md | 2 +-
.../docusaurus-plugin-content-docs/version-1.0.3/api/login_api.md | 2 +-
.../architecture/computation_governance_services/overview.md | 2 +-
.../job_submission_preparation_and_execution_process.md | 2 +-
.../version-1.0.3/deployment/quick_deploy.md | 2 +-
.../version-1.0.3/development/linkis_compile_and_package.md | 2 +-
.../version-1.0.3/engine_usage/flink.md | 2 +-
.../version-1.0.3/engine_usage/hive.md | 4 ++--
.../version-1.0.3/engine_usage/spark.md | 4 ++--
.../version-1.1.0/api/linkis_task_operator.md | 2 +-
.../docusaurus-plugin-content-docs/version-1.1.0/api/login_api.md | 2 +-
.../architecture/computation_governance_services/overview.md | 2 +-
.../job_submission_preparation_and_execution_process.md | 2 +-
.../version-1.1.0/deployment/quick_deploy.md | 2 +-
.../version-1.1.0/development/linkis_compile_and_package.md | 2 +-
.../version-1.1.0/engine_usage/flink.md | 2 +-
.../version-1.1.0/engine_usage/hive.md | 4 ++--
.../version-1.1.0/engine_usage/spark.md | 4 ++--
.../version-1.1.1/api/linkis_task_operator.md | 2 +-
.../docusaurus-plugin-content-docs/version-1.1.1/api/login_api.md | 2 +-
.../architecture/computation_governance_services/overview.md | 2 +-
.../job_submission_preparation_and_execution_process.md | 2 +-
.../version-1.1.1/deployment/quick_deploy.md | 4 ++--
.../version-1.1.1/development/linkis_compile_and_package.md | 2 +-
.../version-1.1.1/engine_usage/flink.md | 2 +-
.../version-1.1.1/engine_usage/hive.md | 4 ++--
.../version-1.1.1/engine_usage/jdbc.md | 2 +-
.../version-1.1.1/engine_usage/openlookeng.md | 2 +-
.../version-1.1.1/engine_usage/spark.md | 4 ++--
src/pages/team/config.json | 4 ++--
versioned_docs/version-1.0.2/api/linkis_task_operator.md | 2 +-
versioned_docs/version-1.0.2/api/login_api.md | 2 +-
.../architecture/computation_governance_services/overview.md | 2 +-
.../job_submission_preparation_and_execution_process.md | 2 +-
.../version-1.0.2/development/linkis_compile_and_package.md | 2 +-
versioned_docs/version-1.0.2/engine_usage/hive.md | 4 ++--
versioned_docs/version-1.0.2/engine_usage/spark.md | 4 ++--
versioned_docs/version-1.0.2/introduction.md | 2 +-
versioned_docs/version-1.0.3/api/linkis_task_operator.md | 2 +-
versioned_docs/version-1.0.3/api/login_api.md | 2 +-
.../architecture/computation_governance_services/overview.md | 2 +-
.../job_submission_preparation_and_execution_process.md | 2 +-
.../version-1.0.3/development/linkis_compile_and_package.md | 2 +-
versioned_docs/version-1.0.3/engine_usage/flink.md | 4 ++--
versioned_docs/version-1.0.3/engine_usage/hive.md | 4 ++--
versioned_docs/version-1.0.3/engine_usage/spark.md | 4 ++--
versioned_docs/version-1.0.3/introduction.md | 2 +-
versioned_docs/version-1.1.0/api/linkis_task_operator.md | 2 +-
versioned_docs/version-1.1.0/api/login_api.md | 2 +-
.../architecture/computation_governance_services/overview.md | 2 +-
.../job_submission_preparation_and_execution_process.md | 2 +-
.../version-1.1.0/development/linkis_compile_and_package.md | 2 +-
versioned_docs/version-1.1.0/engine_usage/flink.md | 4 ++--
versioned_docs/version-1.1.0/engine_usage/hive.md | 4 ++--
versioned_docs/version-1.1.0/engine_usage/spark.md | 4 ++--
versioned_docs/version-1.1.0/introduction.md | 2 +-
versioned_docs/version-1.1.1/api/linkis_task_operator.md | 2 +-
versioned_docs/version-1.1.1/api/login_api.md | 2 +-
.../architecture/computation_governance_services/overview.md | 2 +-
.../job_submission_preparation_and_execution_process.md | 2 +-
.../version-1.1.1/development/linkis_compile_and_package.md | 2 +-
versioned_docs/version-1.1.1/engine_usage/flink.md | 4 ++--
versioned_docs/version-1.1.1/engine_usage/hive.md | 4 ++--
versioned_docs/version-1.1.1/engine_usage/openlookeng.md | 2 +-
versioned_docs/version-1.1.1/engine_usage/spark.md | 4 ++--
versioned_docs/version-1.1.1/introduction.md | 2 +-
102 files changed, 133 insertions(+), 133 deletions(-)
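Most of the hunks below apply one recurring transformation: bare `dir/page.md` link targets, which Docusaurus resolves relative to the current page and often 404s, are rewritten either as site-absolute routes (`/community/...`, `/docs/latest/...`) or as relative routes with the `.md` suffix dropped (`../deployment/...`). A rough sketch of the sibling-relative case is below; `fix_links` and its regex are illustrative only and not part of this commit, and the real fix required per-file judgment about which links should instead become site-absolute:

```python
import re

# Match "](dir/page.md)" targets that are neither external URLs nor
# already site-absolute ("/...") or explicitly relative ("./", "../").
LINK_RE = re.compile(r"\]\((?!https?://|/|\.\./|\./)([\w/.-]+?)\.md\)")

def fix_links(markdown: str) -> str:
    # Drop the ".md" suffix and anchor the path one level up, mirroring
    # the "deployment/x.md" -> "../deployment/x" pattern in this diff.
    return LINK_RE.sub(lambda m: "](../" + m.group(1) + ")", markdown)

before = "[EngineConnPlugin Installation](deployment/engine_conn_plugin_installation.md)"
print(fix_links(before))
# -> [EngineConnPlugin Installation](../deployment/engine_conn_plugin_installation)
```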
diff --git a/docs/api/linkis_task_operator.md b/docs/api/linkis_task_operator.md
index b0027c13a..203fa570b 100644
--- a/docs/api/linkis_task_operator.md
+++ b/docs/api/linkis_task_operator.md
@@ -23,7 +23,7 @@ sidebar_position: 2
- data: return specific data.
- message: return the requested prompt message. If the status is not 0, the message returned is an error message, and the data may have a stack field, which returns specific stack information.
-For more information about the Linkis Restful interface specification, please refer to: [Linkis Restful Interface Specification](development/development_specification/api.md)
+For more information about the Linkis Restful interface specification, please refer to: [Linkis Restful Interface Specification](/community/development_specification/api)
### 1. Submit for Execution
diff --git a/docs/api/login_api.md b/docs/api/login_api.md
index 66fee0faf..522fb1105 100644
--- a/docs/api/login_api.md
+++ b/docs/api/login_api.md
@@ -60,7 +60,7 @@ We provide the following login-related interfaces:
- data: return specific data.
- message: return the requested prompt message. If the status is not 0, the message returns an error message, and the data may have a stack field, which returns specific stack information.
-For more information about the Linkis Restful interface specification, please refer to: [Linkis Restful Interface Specification](development/development_specification/api.md)
+For more information about the Linkis Restful interface specification, please refer to: [Linkis Restful Interface Specification](/community/development_specification/api)
### 1). Login In
diff --git a/docs/architecture/computation_governance_services/overview.md b/docs/architecture/computation_governance_services/overview.md
index 062ccc91c..80c421ad7 100644
--- a/docs/architecture/computation_governance_services/overview.md
+++ b/docs/architecture/computation_governance_services/overview.md
@@ -33,7 +33,7 @@ Perform three stages to fully upgrade Linkis's Job execution architecture, as sh
<!--
#todo Orchestrator documentation is not ready yet
-[Enter Orchestrator Architecture Design](orchestrator/overview.md)
+[Enter Orchestrator Architecture Design]()
-->
### 3. LinkisManager
diff --git a/docs/architecture/job_submission_preparation_and_execution_process.md b/docs/architecture/job_submission_preparation_and_execution_process.md
index cd7f483fa..dd6e8436c 100644
--- a/docs/architecture/job_submission_preparation_and_execution_process.md
+++ b/docs/architecture/job_submission_preparation_and_execution_process.md
@@ -102,7 +102,7 @@ The orchestration process of Linkis Orchestrator is similar to many SQL parsing
<!--
#todo Orchestrator documentation is not ready yet
-Please refer to [Orchestrator Architecture Design](architecture/orchestrator/orchestrator_architecture_doc.md) for more details.
+Please refer to [Orchestrator Architecture Design]() for more details.
-->
After the analysis and arrangement of Linkis Orchestrator, the computing task has been transformed into a executable physical tree. Orchestrator will submit the Physical tree to Orchestrator's Execution module and enter the final execution stage.
diff --git a/docs/development/linkis_compile_and_package.md b/docs/development/linkis_compile_and_package.md
index 6fd8678fd..f90db5fd2 100644
--- a/docs/development/linkis_compile_and_package.md
+++ b/docs/development/linkis_compile_and_package.md
@@ -108,7 +108,7 @@ Get the installation package, there will be a compiled package in the ->target d
incubator-linkis-x.x.x/linkis-engineconn-plugins/engineconn-plugins/spark/target/linkis-engineplugin-spark-x.x.x.jar
```
-How to install Spark engine separately? Please refer to [Linkis Engine Plugin Installation Document](deployment/engine_conn_plugin_installation.md)
+How to install Spark engine separately? Please refer to [Linkis Engine Plugin Installation Document](../deployment/engine_conn_plugin_installation)
## 5. How to modify the Hadoop, Hive, and Spark versions that Linkis depends on
diff --git a/docs/engine_usage/flink.md b/docs/engine_usage/flink.md
index ec566a99f..2e6124914 100644
--- a/docs/engine_usage/flink.md
+++ b/docs/engine_usage/flink.md
@@ -56,13 +56,13 @@ cd ${LINKIS_HOME}/sbin
sh linkis-daemon restart cg-engineplugin
```
A more detailed introduction to engineplugin can be found in the following article.
-[EngineConnPlugin Installation](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation](../deployment/engine_conn_plugin_installation)
### 2.3 Flink engine tags
Linkis1.0 is done through tags, so we need to insert data in our database, the way of inserting is shown below.
-[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](../deployment/engine_conn_plugin_installation)
## 3. The use of Flink engine
diff --git a/docs/engine_usage/hive.md b/docs/engine_usage/hive.md
index c01297337..bc52ab484 100644
--- a/docs/engine_usage/hive.md
+++ b/docs/engine_usage/hive.md
@@ -32,13 +32,13 @@ Other hive operating modes are similar, just copy the corresponding dependencies
If you have already compiled your hive engineConn plug-in has been compiled, then you need to put the new plug-in in the specified location to load, you can refer to the following article for details
-[EngineConnPlugin Installation](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation](../deployment/engine_conn_plugin_installation)
### 2.3 Linkis adds Hive console parameters(optional)
Linkis can configure the corresponding EngineConn parameters on the management console. If your newly added EngineConn needs this feature, you can refer to the following documents:
-[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](../deployment/engine_conn_plugin_installation)
## 3. Use of hive engineConn
diff --git a/docs/engine_usage/openlookeng.md b/docs/engine_usage/openlookeng.md
index aff47b340..80586220b 100644
--- a/docs/engine_usage/openlookeng.md
+++ b/docs/engine_usage/openlookeng.md
@@ -47,7 +47,7 @@ sh linkis-daemon restart cg-engineplugin
Linkis1.X is done through tags, so we need to insert data into our database, and the insertion method is as follows.
-[EngineConnPlugin engine plugin installation](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin engine plugin installation](../deployment/engine_conn_plugin_installation)
## 3 The use of the engine
diff --git a/docs/engine_usage/pipeline.md b/docs/engine_usage/pipeline.md
index ef393cc82..f35e192ed 100644
--- a/docs/engine_usage/pipeline.md
+++ b/docs/engine_usage/pipeline.md
@@ -62,7 +62,7 @@ select * from linkis_cg_engine_conn_plugin_bml_resources
Linkis1.XIt is carried out through labels, so it is necessary to insert data into our database. The insertion method is shown below.
-[EngineConnPlugin Engine plug-in installation](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Engine plug-in installation](../deployment/engine_conn_plugin_installation)
## 2 Use of engine
diff --git a/docs/engine_usage/spark.md b/docs/engine_usage/spark.md
index e5ec91168..106ebce8f 100644
--- a/docs/engine_usage/spark.md
+++ b/docs/engine_usage/spark.md
@@ -34,13 +34,13 @@ In theory, Linkis1.0 supports all versions of spark2.x and above. Spark 2.4.3 is
If you have already compiled your spark EngineConn plug-in has been compiled, then you need to put the new plug-in to the specified location to load, you can refer to the following article for details
-[EngineConnPlugin Installation](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation](../deployment/engine_conn_plugin_installation)
### 2.3 tags of spark EngineConn
Linkis1.0 is done through tags, so we need to insert data in our database, the way of inserting is shown below.
-[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](../deployment/engine_conn_plugin_installation)
## 3. Use of spark EngineConn
diff --git a/docs/introduction.md b/docs/introduction.md
index 39f726571..49fadcbe6 100644
--- a/docs/introduction.md
+++ b/docs/introduction.md
@@ -55,7 +55,7 @@ Please follow [Compile Guide](development/linkis_compile_and_package.md) to comp
Please refer to [Deployment_Documents](deployment/quick_deploy.md) to do the deployment.
## Examples and Guidance
-You can find examples and guidance for how to use and manage Linkis in [User_Manual](user_guide/overview.md), [Engine_Usage_Documents](engine_usage/overview.md) and [API_Documents](dapi/overview.md).
+You can find examples and guidance for how to use and manage Linkis in [User_Manual](user_guide/overview.md), [Engine_Usage_Documents](engine_usage/overview.md) and [API_Documents](../docs/api/overview.md).
## Documentation
diff --git a/i18n/zh-CN/docusaurus-plugin-content-blog/2022-02-08-how-to-user-blog.md b/i18n/zh-CN/docusaurus-plugin-content-blog/2022-02-08-how-to-user-blog.md
index d8dc12db8..b293d956b 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-blog/2022-02-08-how-to-user-blog.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-blog/2022-02-08-how-to-user-blog.md
@@ -73,7 +73,7 @@ slug: welcome-docusaurus-v2
authors:
- name: Joel Marcey
title: Co-creator of Docusaurus 1
- url: https://github.com/JoelMarcey
+ url: https://github.com/JoelMarceyengine_conn_plugin_installation
image_url: https://github.com/JoelMarcey.png
- name: Sébastien Lorber
title: Docusaurus maintainer
diff --git a/i18n/zh-CN/docusaurus-plugin-content-blog/2022-02-21-linkis-deploy/index.md b/i18n/zh-CN/docusaurus-plugin-content-blog/2022-02-21-linkis-deploy/index.md
index d1f2ada49..9ad852f98 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-blog/2022-02-21-linkis-deploy/index.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-blog/2022-02-21-linkis-deploy/index.md
@@ -269,7 +269,7 @@ Your default account password is [hadoop/5e8e312b4]`
### 3.4 添加mysql驱动(>=1.0.3)版本
因为license原因,linkis官方发布包中(dss集成的全家桶会包含,无需手动添加)移除了mysql-connector-java,需要手动添加
-具体参见[ 添加mysql驱动包](docs/latest/deployment/quick_deploy#-44-添加mysql驱动包)
+具体参见[ 添加mysql驱动包](/docs/latest/deployment/quick_deploy#-44-添加mysql驱动包)
### 3.5 启动服务
```shell script
@@ -434,7 +434,7 @@ select * from linkis_cg_engine_conn_plugin_bml_resources
查看引擎的物料记录是否存在(如果有更新,查看更新时间是否正确)。
-如果不存在或则未更新,先尝试手动刷新物料资源(详细见[引擎物料资源刷新](docs/latest/deployment/engine_conn_plugin_installation#23-引擎刷新))。通过`log/linkis-cg-engineplugin.log`日志,查看物料失败的具体原因,很多时候可能是hdfs目录没有权限导致,检查gateway地址配置是否正确`conf/linkis.properties:wds.linkis.gateway.url`
+如果不存在或则未更新,先尝试手动刷新物料资源(详细见[引擎物料资源刷新](/docs/latest/deployment/engine_conn_plugin_installation#23-引擎刷新))。通过`log/linkis-cg-engineplugin.log`日志,查看物料失败的具体原因,很多时候可能是hdfs目录没有权限导致,检查gateway地址配置是否正确`conf/linkis.properties:wds.linkis.gateway.url`
引擎的物料资源默认上传到hdfs目录为 `/apps-data/${deployUser}/bml`
```shell script
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs-download/current/download-logo.md b/i18n/zh-CN/docusaurus-plugin-content-docs-download/current/download-logo.md
index 4644abdd9..0de469611 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs-download/current/download-logo.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs-download/current/download-logo.md
@@ -4,7 +4,7 @@ sidebar_position: 11
---
| Logo 名称 |图片示例| 下载地址 | 日期 |
|:------------|:----:|:----:|:----:|
-|Linkis 反白透明| <img src={require('./../../../../static/logo/linkis-anti-white-transparent.png').default} width="200" style={{ backgroundColor:'hsl(0, 0%, 90%)'}} />|[点击下载](../static/logo/linkis-anti-white-transparent.png)|2022-06-01|
-|Linkis 白底 |<img src={require('./../../../../static/logo/linkis-white.png').default} width="200"/> | [点击下载](../static/logo/linkis-white.png)|2022-06-01|
-|Linkis 透明底| <img src={require('./../../../../static/logo/linkis-transparent.png').default} width="200"/>|[点击下载](../static/logo/linkis-transparent.png)|2022-06-01|
+|Linkis 反白透明| <img src={require('./../../../../static/logo/linkis-anti-white-transparent.png').default} width="200" style={{ backgroundColor:'hsl(0, 0%, 90%)'}} />|[点击下载](./../../../../static/logo/linkis-anti-white-transparent.png)|2022-06-01|
+|Linkis 白底 |<img src={require('./../../../../static/logo/linkis-white.png').default} width="200"/> | [点击下载](./../../../../static/logo/linkis-white.png)|2022-06-01|
+|Linkis 透明底| <img src={require('./../../../../static/logo/linkis-transparent.png').default} width="200"/>|[点击下载](./../../../../static/logo/linkis-transparent.png)|2022-06-01|
|Linkis and WeDataSphere 组件及源文件 |- |[点击下载](./../../../../static/logo/linkis-and-WeDataSphere-component.ai)|2022-06-01|
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs-faq/current/main.md b/i18n/zh-CN/docusaurus-plugin-content-docs-faq/current/main.md
index 568bcff85..8109cdea6 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs-faq/current/main.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs-faq/current/main.md
@@ -344,7 +344,7 @@ Failed to async get EngineNode ErrorException: errCode: 0 ,desc: operation fail
```
解决办法
-需要安装下对应的引擎插件,可以参考:[引擎安装指引](deployment/engine_conn_plugin_installation.md)
+需要安装下对应的引擎插件,可以参考:[引擎安装指引](/docs/latest/deployment/engine_conn_plugin_installation)
#### Q37.关闭资源检查
报错现象:资源不足
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/api/linkis_task_operator.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/api/linkis_task_operator.md
index 5a791a487..02b749fd4 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/api/linkis_task_operator.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/api/linkis_task_operator.md
@@ -18,7 +18,7 @@
- data:返回具体的数据。
- message:返回请求的提示信息。如果status非0时,message返回的是错误信息,其中data有可能存在stack字段,返回具体的堆栈信息。
-更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](development/development_specification/api.md)
+更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](/community/development_specification/api)
### 1. 提交执行
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/api/login_api.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/api/login_api.md
index c5e7cd2f7..91c83b455 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/api/login_api.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/api/login_api.md
@@ -63,7 +63,7 @@ wds.linkis.ldap.proxy.baseDN=dc=webank,dc=com # 您的LDAP服务的配置
- data:返回具体的数据。
- message:返回请求的提示信息。如果status非0时,message返回的是错误信息,其中data有可能存在stack字段,返回具体的堆栈信息。
-更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](development/development_specification/api.md)
+更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](/community/development_specification/api)
### 4.1 登录
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/architecture/computation_governance_services/overview.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/architecture/computation_governance_services/overview.md
index 20f60b382..65ea36c02 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/architecture/computation_governance_services/overview.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/architecture/computation_governance_services/overview.md
@@ -49,7 +49,7 @@ Linkis1.0将优化Job的整体执行流程,从提交 —\> 准备 —\>
<!--
#todo Orchestrator文档还没准备好!!
- [进入Orchestrator架构设计](orchestrator/overview.md)
+ [进入Orchestrator架构设计]()
-->
### 3、LinkisManager
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/architecture/job_submission_preparation_and_execution_process.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/architecture/job_submission_preparation_and_execution_process.md
index 8b4f426d9..d0a8dcfc8 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/architecture/job_submission_preparation_and_execution_process.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/architecture/job_submission_preparation_and_execution_process.md
@@ -118,7 +118,7 @@ Linkis Orchestrator的编排流程与很多SQL解析引擎(如Spark、Hive的S
<!--
#todo Orchestrator文档还没准备好!!
-关于Orchestrator的编排详细介绍,请参考:[Orchestrator架构设计](architecture/orchestrator/orchestrator_architecture_doc.md)
+关于Orchestrator的编排详细介绍,请参考:[Orchestrator架构设计]()
-->
经过了Linkis Orchestrator的解析编排后,用户的计算任务已经转换成了一颗可被执行的Physical树。Orchestrator会将该Physical树提交给Orchestrator的Execution模块,进入最后的执行阶段。
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/linkis_scriptis_install.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/linkis_scriptis_install.md
index 589ccc718..ae4bbb2ba 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/linkis_scriptis_install.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/linkis_scriptis_install.md
@@ -8,7 +8,7 @@ sidebar_position: 10
> 在Linkis1.0和DSS 1.1.X之后,支持将Scritpis单独部署来集成Linkis,使用Scriptis的交互式分析的功能,可以在web
> 页面在线写SQL、Pyspark、HiveQL等脚本,提交给Linkis执行且支持UDF、函数、资源管控和自定义变量等特性,本文将介绍如何单独部署Web组件-Scriptis,并通过Scriptis这种Web页面来使用Linkis。
-前提:已经成功安装并可以正常使用了linkis服务(后端和管理台服务),linkis的部署流程可以见[Linkis的快速部署](deployment/quick_deploy)
+前提:已经成功安装并可以正常使用了linkis服务(后端和管理台服务),linkis的部署流程可以见[Linkis的快速部署](/docs/1.1.2/deployment/quick_deploy)
示例说明:
- linkis-gateway服务的地址为10.10.10.10 端口为9001
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/quick_deploy.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/quick_deploy.md
index 0bfc77c99..1e58fa54e 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/quick_deploy.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/quick_deploy.md
@@ -16,7 +16,7 @@ sidebar_position: 1
**如果您已经是 Linkis 的使用用户,安装或升级前建议先阅读:[Linkis1.0 与 Linkis0.X 的区别简述](architecture/difference_between_1.0_and_0.x.md)**。
-请注意:除了 Linkis1.0 安装包默认已经包含的:Python/Shell/Hive/Spark四个EngineConnPlugin以外,如果大家有需要,可以手动安装如 JDBC 引擎等类型的其他引擎,具体请参考 [EngineConnPlugin引擎插件安装文档](deployment/engine_conn_plugin_installation.md)。
+请注意:除了 Linkis1.0 安装包默认已经包含的:Python/Shell/Hive/Spark四个EngineConnPlugin以外,如果大家有需要,可以手动安装如 JDBC 引擎等类型的其他引擎,具体请参考 [EngineConnPlugin引擎插件安装文档](/docs/1.1.2/deployment/engine_conn_plugin_installation)。
Linkis1.0.3 默认已适配的引擎列表如下:
@@ -445,7 +445,7 @@ select * from linkis_cg_engine_conn_plugin_bml_resources
查看引擎的物料记录是否存在(如果有更新,查看更新时间是否正确)。
-如果不存在或则未更新,先尝试手动刷新物料资源(详细见[引擎物料资源刷新](docs/latest/deployment/engine_conn_plugin_installation#23-引擎刷新))。通过`log/linkis-cg-engineplugin.log`日志,查看物料失败的具体原因,很多时候可能是hdfs目录没有权限导致,检查gateway地址配置是否正确`conf/linkis.properties:wds.linkis.gateway.url`
+如果不存在或则未更新,先尝试手动刷新物料资源(详细见[引擎物料资源刷新](/docs/latest/deployment/engine_conn_plugin_installation#23-引擎刷新))。通过`log/linkis-cg-engineplugin.log`日志,查看物料失败的具体原因,很多时候可能是hdfs目录没有权限导致,检查gateway地址配置是否正确`conf/linkis.properties:wds.linkis.gateway.url`
引擎的物料资源默认上传到hdfs目录为 `/apps-data/${deployUser}/bml`
```shell script
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/linkis_compile_and_package.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/linkis_compile_and_package.md
index 3930c8e2e..2e8e50026 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/linkis_compile_and_package.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/linkis_compile_and_package.md
@@ -98,7 +98,7 @@ __编译环境要求:__ 必须 **JDK8** 以上,**Oracle/Sun** 和 **OpenJDK
incubator-linkis-x.x.x/linkis-engineconn-plugins/engineconn-plugins/spark/target/out/spark
```
-如何单独安装 Spark 引擎?请参考 [Linkis 引擎插件安装文档](deployment/engine_conn_plugin_installation.md)
+如何单独安装 Spark 引擎?请参考 [Linkis 引擎插件安装文档](../deployment/engine_conn_plugin_installation)
## 5. 如何修改Linkis的依赖的Hadoop、Hive、Spark版本
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/flink.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/flink.md
index 2528dc8fb..12979f32c 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/flink.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/flink.md
@@ -63,7 +63,7 @@ https://linkis.apache.org/zh-CN/docs/1.1.1/deployment/engine_conn_plugin_install
Linkis1.X是通过标签来进行的,所以需要在我们数据库中插入数据,插入的方式如下文所示。
-[EngineConnPlugin引擎插件安装](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装](../deployment/engine_conn_plugin_installation)
## 3.Flink引擎的使用
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/hive.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/hive.md
index 9b59a9f8a..47116ad53 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/hive.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/hive.md
@@ -35,13 +35,13 @@ on Tez,需要您按照此pr进行一下修改。
如果您已经编译完了您的hive引擎的插件已经编译完成,那么您需要将新的插件放置到指定的位置中才能加载,具体可以参考下面这篇文章
-[EngineConnPlugin引擎插件安装](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装](../deployment/engine_conn_plugin_installation)
### 2.3 hive引擎的标签
Linkis1.X是通过标签来进行的,所以需要在我们数据库中插入数据,插入的方式如下文所示。
-[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](../deployment/engine_conn_plugin_installation)
## 3.hive引擎的使用
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/jdbc.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/jdbc.md
index f476e060c..fcd68a8f8 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/jdbc.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/jdbc.md
@@ -44,7 +44,7 @@ sh linkis-daemon.sh restart cg-engineplugin
Linkis1.X是通过标签来进行的,所以需要在我们数据库中插入数据,插入的方式如下文所示。
-[EngineConnPlugin引擎插件安装](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装](../deployment/engine_conn_plugin_installation)
## 3.JDBC引擎的使用
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/openlookeng.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/openlookeng.md
index 751dd3b11..de4bb7a6a 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/openlookeng.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/openlookeng.md
@@ -47,7 +47,7 @@ sh linkis-daemon.sh restart cg-engineplugin
Linkis1.X是通过标签来进行的,所以需要在我们数据库中插入数据,插入的方式如下文所示。
-[EngineConnPlugin引擎插件安装](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装](../deployment/engine_conn_plugin_installation)
## 3 引擎的使用
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/pipeline.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/pipeline.md
index b29a95338..dcc308de4 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/pipeline.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/pipeline.md
@@ -63,7 +63,7 @@ select * from linkis_cg_engine_conn_plugin_bml_resources
Linkis1.X是通过标签来进行的,所以需要在我们数据库中插入数据,插入的方式如下文所示。
-[EngineConnPlugin引擎插件安装](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装](../deployment/engine_conn_plugin_installation)
## 2 引擎的使用
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/spark.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/spark.md
index 5180f33fe..6cad8cfc7 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/spark.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine_usage/spark.md
@@ -33,13 +33,13 @@ sidebar_position: 1
如果您已经编译完了您的spark引擎的插件,那么您需要将新的插件放置到指定的位置中才能加载,具体可以参考下面这篇文章
-[EngineConnPlugin引擎插件安装](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装](../deployment/engine_conn_plugin_installation)
### 2.3 spark引擎的标签
Linkis1.X是通过标签配置来区分引擎版本的,所以需要我们在数据库中插入数据,插入的方式如下文所示。
-[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](../deployment/engine_conn_plugin_installation)
## 3.spark引擎的使用
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/api/linkis_task_operator.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/api/linkis_task_operator.md
index 5a791a487..02b749fd4 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/api/linkis_task_operator.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/api/linkis_task_operator.md
@@ -18,7 +18,7 @@
- data:返回具体的数据。
- message:返回请求的提示信息。如果status非0时,message返回的是错误信息,其中data有可能存在stack字段,返回具体的堆栈信息。
-更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](development/development_specification/api.md)
+更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](/community/development_specification/api)
### 1. 提交执行
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/api/login_api.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/api/login_api.md
index c5e7cd2f7..91c83b455 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/api/login_api.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/api/login_api.md
@@ -63,7 +63,7 @@ wds.linkis.ldap.proxy.baseDN=dc=webank,dc=com # 您的LDAP服务的配置
- data:返回具体的数据。
- message:返回请求的提示信息。如果status非0时,message返回的是错误信息,其中data有可能存在stack字段,返回具体的堆栈信息。
-更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](development/development_specification/api.md)
+更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](/community/development_specification/api)
### 4.1 登录
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/architecture/computation_governance_services/overview.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/architecture/computation_governance_services/overview.md
index b07aa19a6..d6bc4d0ad 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/architecture/computation_governance_services/overview.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/architecture/computation_governance_services/overview.md
@@ -49,7 +49,7 @@ Linkis1.0将优化Job的整体执行流程,从提交 —\> 准备 —\>
<!--
#todo Orchestrator文档还没准备好!!
- [进入Orchestrator架构设计](orchestrator/overview.md)
+ [进入Orchestrator架构设计]()
-->
### 3、LinkisManager
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/architecture/job_submission_preparation_and_execution_process.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/architecture/job_submission_preparation_and_execution_process.md
index 8b4f426d9..d0a8dcfc8 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/architecture/job_submission_preparation_and_execution_process.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/architecture/job_submission_preparation_and_execution_process.md
@@ -118,7 +118,7 @@ Linkis Orchestrator的编排流程与很多SQL解析引擎(如Spark、Hive的S
<!--
#todo Orchestrator文档还没准备好!!
-关于Orchestrator的编排详细介绍,请参考:[Orchestrator架构设计](architecture/orchestrator/orchestrator_architecture_doc.md)
+关于Orchestrator的编排详细介绍,请参考:[Orchestrator架构设计]()
-->
经过了Linkis Orchestrator的解析编排后,用户的计算任务已经转换成了一颗可被执行的Physical树。Orchestrator会将该Physical树提交给Orchestrator的Execution模块,进入最后的执行阶段。
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/deployment/quick_deploy.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/deployment/quick_deploy.md
index 5046a2740..a77bba08a 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/deployment/quick_deploy.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/deployment/quick_deploy.md
@@ -6,7 +6,7 @@ sidebar_position: 1
**如果您是首次接触并使用Linkis,您可以忽略该章节;如果您已经是 Linkis 的使用用户,安装或升级前建议先阅读:[Linkis1.0 与 Linkis0.X 的区别简述](architecture/difference_between_1.0_and_0.x.md)**。
- 请注意:除了 Linkis1.0 安装包默认已经包含的:Python/Shell/Hive/Spark四个EngineConnPlugin以外,如果大家有需要,可以手动安装如 JDBC 引擎等类型的其他引擎,具体请参考 [EngineConnPlugin引擎插件安装文档](deployment/engine_conn_plugin_installation.md)。
+ 请注意:除了 Linkis1.0 安装包默认已经包含的:Python/Shell/Hive/Spark四个EngineConnPlugin以外,如果大家有需要,可以手动安装如 JDBC 引擎等类型的其他引擎,具体请参考 [EngineConnPlugin引擎插件安装文档](../deployment/engine_conn_plugin_installation)。
**Linkis Docker镜像**
[Linkis 0.10.0 Docker](https://hub.docker.com/repository/docker/wedatasphere/linkis)
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/development/linkis_compile_and_package.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/development/linkis_compile_and_package.md
index 23ccc7332..f42a7c107 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/development/linkis_compile_and_package.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/development/linkis_compile_and_package.md
@@ -15,7 +15,7 @@ __编译环境要求:__ 必须 **JDK8** 以上,**Oracle/Sun** 和 **OpenJDK
**请注意**:官方推荐使用 Hadoop-2.7.2、Hive-1.2.1、Spark-2.4.3 和 Scala-2.11.12 对 Linkis 进行编译。
-如果您想使用 Hadoop、Hive、Spark 的其他版本对 Linkis 进行编译,请参考:[如何修改Linkis的依赖的Hadoop、Hive、Spark版本](5-如何修改linkis的依赖的hadoophivespark版本)
+如果您想使用 Hadoop、Hive、Spark 的其他版本对 Linkis 进行编译,请参考:[如何修改Linkis的依赖的Hadoop、Hive、Spark版本](#5-如何修改linkis的依赖的hadoophivespark版本)
## 2. 全量编译 Linkis
@@ -94,7 +94,7 @@ __编译环境要求:__ 必须 **JDK8** 以上,**Oracle/Sun** 和 **OpenJDK
wedatasphere-linkis-x.x.x/linkis-engineconn-plugins/engineconn-plugins/spark/target/linkis-engineplugin-spark-x.x.x.jar
```
-如何单独安装 Spark 引擎? 请参考 [Linkis 引擎插件安装文档](deployment/engine_conn_plugin_installation.md)
+如何单独安装 Spark 引擎? 请参考 [Linkis 引擎插件安装文档](../deployment/engine_conn_plugin_installation)
## 5. 如何修改Linkis的依赖的Hadoop、Hive、Spark版本
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/engine_usage/hive.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/engine_usage/hive.md
index a110e8d02..0303af038 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/engine_usage/hive.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/engine_usage/hive.md
@@ -35,13 +35,13 @@ on Tez,需要您按照此pr进行一下修改。
如果您已经编译完了您的hive引擎的插件已经编译完成,那么您需要将新的插件放置到指定的位置中才能加载,具体可以参考下面这篇文章
-[EngineConnPlugin引擎插件安装](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装](../deployment/engine_conn_plugin_installation)
### 2.3 hive引擎的标签
Linkis1.0是通过标签来进行的,所以需要在我们数据库中插入数据,插入的方式如下文所示。
-[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](../deployment/engine_conn_plugin_installation)
## 3.hive引擎的使用
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/engine_usage/spark.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/engine_usage/spark.md
index 92fd55438..15d85d429 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/engine_usage/spark.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2/engine_usage/spark.md
@@ -33,13 +33,13 @@ sidebar_position: 1
如果您已经编译完了您的spark引擎的插件已经编译完成,那么您需要将新的插件放置到指定的位置中才能加载,具体可以参考下面这篇文章
-[EngineConnPlugin引擎插件安装](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装](../deployment/engine_conn_plugin_installation)
### 2.3 spark引擎的标签
Linkis1.0是通过标签来进行的,所以需要在我们数据库中插入数据,插入的方式如下文所示。
-[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](../deployment/engine_conn_plugin_installation)
## 3.spark引擎的使用
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/api/linkis_task_operator.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/api/linkis_task_operator.md
index 5a791a487..02b749fd4 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/api/linkis_task_operator.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/api/linkis_task_operator.md
@@ -18,7 +18,7 @@
- data:返回具体的数据。
- message:返回请求的提示信息。如果status非0时,message返回的是错误信息,其中data有可能存在stack字段,返回具体的堆栈信息。
-更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](development/development_specification/api.md)
+更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](/community/development_specification/api)
### 1. 提交执行
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/api/login_api.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/api/login_api.md
index c5e7cd2f7..91c83b455 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/api/login_api.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/api/login_api.md
@@ -63,7 +63,7 @@ wds.linkis.ldap.proxy.baseDN=dc=webank,dc=com # 您的LDAP服务的配置
- data:返回具体的数据。
- message:返回请求的提示信息。如果status非0时,message返回的是错误信息,其中data有可能存在stack字段,返回具体的堆栈信息。
-更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](development/development_specification/api.md)
+更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](/community/development_specification/api)
### 4.1 登录
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/architecture/computation_governance_services/overview.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/architecture/computation_governance_services/overview.md
index 20f60b382..65ea36c02 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/architecture/computation_governance_services/overview.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/architecture/computation_governance_services/overview.md
@@ -49,7 +49,7 @@ Linkis1.0将优化Job的整体执行流程,从提交 —\> 准备 —\>
<!--
#todo Orchestrator文档还没准备好!!
- [进入Orchestrator架构设计](orchestrator/overview.md)
+ [进入Orchestrator架构设计]()
-->
### 3、LinkisManager
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/architecture/job_submission_preparation_and_execution_process.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/architecture/job_submission_preparation_and_execution_process.md
index 8b4f426d9..d0a8dcfc8 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/architecture/job_submission_preparation_and_execution_process.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/architecture/job_submission_preparation_and_execution_process.md
@@ -118,7 +118,7 @@ Linkis Orchestrator的编排流程与很多SQL解析引擎(如Spark、Hive的S
<!--
#todo Orchestrator文档还没准备好!!
-关于Orchestrator的编排详细介绍,请参考:[Orchestrator架构设计](architecture/orchestrator/orchestrator_architecture_doc.md)
+关于Orchestrator的编排详细介绍,请参考:[Orchestrator架构设计]()
-->
经过了Linkis Orchestrator的解析编排后,用户的计算任务已经转换成了一颗可被执行的Physical树。Orchestrator会将该Physical树提交给Orchestrator的Execution模块,进入最后的执行阶段。
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/deployment/quick_deploy.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/deployment/quick_deploy.md
index 52d7baf7c..b1c38858f 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/deployment/quick_deploy.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/deployment/quick_deploy.md
@@ -16,7 +16,7 @@ sidebar_position: 1
**如果您是首次接触并使用Linkis,您可以忽略该章节;如果您已经是 Linkis 的使用用户,安装或升级前建议先阅读:[Linkis1.0 与 Linkis0.X 的区别简述](architecture/difference_between_1.0_and_0.x.md)**。
-请注意:除了 Linkis1.0 安装包默认已经包含的:Python/Shell/Hive/Spark四个EngineConnPlugin以外,如果大家有需要,可以手动安装如 JDBC 引擎等类型的其他引擎,具体请参考 [EngineConnPlugin引擎插件安装文档](deployment/engine_conn_plugin_installation.md)。
+请注意:除了 Linkis1.0 安装包默认已经包含的:Python/Shell/Hive/Spark四个EngineConnPlugin以外,如果大家有需要,可以手动安装如 JDBC 引擎等类型的其他引擎,具体请参考 [EngineConnPlugin引擎插件安装文档](../deployment/engine_conn_plugin_installation)。
Linkis1.0.3 默认已适配的引擎列表如下:
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/development/linkis_compile_and_package.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/development/linkis_compile_and_package.md
index 606eae95d..3c8d801b0 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/development/linkis_compile_and_package.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/development/linkis_compile_and_package.md
@@ -107,7 +107,7 @@ __编译环境要求:__ 必须 **JDK8** 以上,**Oracle/Sun** 和 **OpenJDK
incubator-linkis-x.x.x/linkis-engineconn-plugins/engineconn-plugins/spark/target/linkis-engineplugin-spark-x.x.x.jar
```
-如何单独安装 Spark 引擎?请参考 [Linkis 引擎插件安装文档](deployment/engine_conn_plugin_installation.md)
+如何单独安装 Spark 引擎?请参考 [Linkis 引擎插件安装文档](../deployment/engine_conn_plugin_installation)
## 5. 如何修改Linkis的依赖的Hadoop、Hive、Spark版本
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/engine_usage/flink.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/engine_usage/flink.md
index dcc2f65a9..71ae433cd 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/engine_usage/flink.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/engine_usage/flink.md
@@ -64,7 +64,7 @@ https://github.com/WeBankFinTech/Linkis/wiki/EngineConnPlugin%E5%BC%95%E6%93%8E%
Linkis1.0是通过标签来进行的,所以需要在我们数据库中插入数据,插入的方式如下文所示。
-[EngineConnPlugin引擎插件安装](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装](../deployment/engine_conn_plugin_installation)
## 3.Flink引擎的使用
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/engine_usage/hive.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/engine_usage/hive.md
index a110e8d02..0303af038 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/engine_usage/hive.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/engine_usage/hive.md
@@ -35,13 +35,13 @@ on Tez,需要您按照此pr进行一下修改。
如果您已经编译完了您的hive引擎的插件已经编译完成,那么您需要将新的插件放置到指定的位置中才能加载,具体可以参考下面这篇文章
-[EngineConnPlugin引擎插件安装](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装](../deployment/engine_conn_plugin_installation)
### 2.3 hive引擎的标签
Linkis1.0是通过标签来进行的,所以需要在我们数据库中插入数据,插入的方式如下文所示。
-[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](../deployment/engine_conn_plugin_installation)
## 3.hive引擎的使用
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/engine_usage/spark.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/engine_usage/spark.md
index 92fd55438..15d85d429 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/engine_usage/spark.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3/engine_usage/spark.md
@@ -33,13 +33,13 @@ sidebar_position: 1
如果您已经编译完了您的spark引擎的插件已经编译完成,那么您需要将新的插件放置到指定的位置中才能加载,具体可以参考下面这篇文章
-[EngineConnPlugin引擎插件安装](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装](../deployment/engine_conn_plugin_installation)
### 2.3 spark引擎的标签
Linkis1.0是通过标签来进行的,所以需要在我们数据库中插入数据,插入的方式如下文所示。
-[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](../deployment/engine_conn_plugin_installation)
## 3.spark引擎的使用
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/api/linkis_task_operator.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/api/linkis_task_operator.md
index 5a791a487..02b749fd4 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/api/linkis_task_operator.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/api/linkis_task_operator.md
@@ -18,7 +18,7 @@
- data:返回具体的数据。
- message:返回请求的提示信息。如果status非0时,message返回的是错误信息,其中data有可能存在stack字段,返回具体的堆栈信息。
-更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](development/development_specification/api.md)
+更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](/community/development_specification/api)
### 1. 提交执行
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/api/login_api.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/api/login_api.md
index c5e7cd2f7..91c83b455 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/api/login_api.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/api/login_api.md
@@ -63,7 +63,7 @@ wds.linkis.ldap.proxy.baseDN=dc=webank,dc=com # 您的LDAP服务的配置
- data:返回具体的数据。
- message:返回请求的提示信息。如果status非0时,message返回的是错误信息,其中data有可能存在stack字段,返回具体的堆栈信息。
-更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](development/development_specification/api.md)
+更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](/community/development_specification/api)
### 4.1 登录
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/architecture/computation_governance_services/overview.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/architecture/computation_governance_services/overview.md
index 20f60b382..65ea36c02 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/architecture/computation_governance_services/overview.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/architecture/computation_governance_services/overview.md
@@ -49,7 +49,7 @@ Linkis1.0将优化Job的整体执行流程,从提交 —\> 准备 —\>
<!--
#todo Orchestrator文档还没准备好!!
- [进入Orchestrator架构设计](orchestrator/overview.md)
+ [进入Orchestrator架构设计]()
-->
### 3、LinkisManager
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/architecture/job_submission_preparation_and_execution_process.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/architecture/job_submission_preparation_and_execution_process.md
index 8b4f426d9..d0a8dcfc8 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/architecture/job_submission_preparation_and_execution_process.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/architecture/job_submission_preparation_and_execution_process.md
@@ -118,7 +118,7 @@ Linkis Orchestrator的编排流程与很多SQL解析引擎(如Spark、Hive的S
<!--
#todo Orchestrator文档还没准备好!!
-关于Orchestrator的编排详细介绍,请参考:[Orchestrator架构设计](architecture/orchestrator/orchestrator_architecture_doc.md)
+关于Orchestrator的编排详细介绍,请参考:[Orchestrator架构设计]()
-->
经过了Linkis Orchestrator的解析编排后,用户的计算任务已经转换成了一颗可被执行的Physical树。Orchestrator会将该Physical树提交给Orchestrator的Execution模块,进入最后的执行阶段。
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/deployment/quick_deploy.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/deployment/quick_deploy.md
index 52d7baf7c..b1c38858f 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/deployment/quick_deploy.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/deployment/quick_deploy.md
@@ -16,7 +16,7 @@ sidebar_position: 1
**如果您是首次接触并使用Linkis,您可以忽略该章节;如果您已经是 Linkis 的使用用户,安装或升级前建议先阅读:[Linkis1.0 与 Linkis0.X 的区别简述](architecture/difference_between_1.0_and_0.x.md)**。
-请注意:除了 Linkis1.0 安装包默认已经包含的:Python/Shell/Hive/Spark四个EngineConnPlugin以外,如果大家有需要,可以手动安装如 JDBC 引擎等类型的其他引擎,具体请参考 [EngineConnPlugin引擎插件安装文档](deployment/engine_conn_plugin_installation.md)。
+请注意:除了 Linkis1.0 安装包默认已经包含的:Python/Shell/Hive/Spark四个EngineConnPlugin以外,如果大家有需要,可以手动安装如 JDBC 引擎等类型的其他引擎,具体请参考 [EngineConnPlugin引擎插件安装文档](../deployment/engine_conn_plugin_installation)。
Linkis1.0.3 默认已适配的引擎列表如下:
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/development/linkis_compile_and_package.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/development/linkis_compile_and_package.md
index 606eae95d..3c8d801b0 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/development/linkis_compile_and_package.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/development/linkis_compile_and_package.md
@@ -107,7 +107,7 @@ __编译环境要求:__ 必须 **JDK8** 以上,**Oracle/Sun** 和 **OpenJDK
incubator-linkis-x.x.x/linkis-engineconn-plugins/engineconn-plugins/spark/target/linkis-engineplugin-spark-x.x.x.jar
```
-如何单独安装 Spark 引擎?请参考 [Linkis 引擎插件安装文档](deployment/engine_conn_plugin_installation.md)
+如何单独安装 Spark 引擎?请参考 [Linkis 引擎插件安装文档](../deployment/engine_conn_plugin_installation)
## 5. 如何修改Linkis的依赖的Hadoop、Hive、Spark版本
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/engine_usage/flink.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/engine_usage/flink.md
index dcc2f65a9..71ae433cd 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/engine_usage/flink.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/engine_usage/flink.md
@@ -64,7 +64,7 @@ https://github.com/WeBankFinTech/Linkis/wiki/EngineConnPlugin%E5%BC%95%E6%93%8E%
Linkis1.0是通过标签来进行的,所以需要在我们数据库中插入数据,插入的方式如下文所示。
-[EngineConnPlugin引擎插件安装](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装](../deployment/engine_conn_plugin_installation)
## 3.Flink引擎的使用
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/engine_usage/hive.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/engine_usage/hive.md
index a110e8d02..0303af038 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/engine_usage/hive.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/engine_usage/hive.md
@@ -35,13 +35,13 @@ on Tez,需要您按照此pr进行一下修改。
如果您已经编译完了您的hive引擎的插件已经编译完成,那么您需要将新的插件放置到指定的位置中才能加载,具体可以参考下面这篇文章
-[EngineConnPlugin引擎插件安装](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装](../deployment/engine_conn_plugin_installation)
### 2.3 hive引擎的标签
Linkis1.0是通过标签来进行的,所以需要在我们数据库中插入数据,插入的方式如下文所示。
-[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](../deployment/engine_conn_plugin_installation)
## 3.hive引擎的使用
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/engine_usage/spark.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/engine_usage/spark.md
index 92fd55438..15d85d429 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/engine_usage/spark.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/engine_usage/spark.md
@@ -33,13 +33,13 @@ sidebar_position: 1
如果您已经编译完了您的spark引擎的插件已经编译完成,那么您需要将新的插件放置到指定的位置中才能加载,具体可以参考下面这篇文章
-[EngineConnPlugin引擎插件安装](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装](../deployment/engine_conn_plugin_installation)
### 2.3 spark引擎的标签
Linkis1.0是通过标签来进行的,所以需要在我们数据库中插入数据,插入的方式如下文所示。
-[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](../deployment/engine_conn_plugin_installation)
## 3.spark引擎的使用
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/api/linkis_task_operator.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/api/linkis_task_operator.md
index 5a791a487..02b749fd4 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/api/linkis_task_operator.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/api/linkis_task_operator.md
@@ -18,7 +18,7 @@
- data:返回具体的数据。
- message:返回请求的提示信息。如果status非0时,message返回的是错误信息,其中data有可能存在stack字段,返回具体的堆栈信息。
-更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](development/development_specification/api.md)
+更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](/community/development_specification/api)
### 1. 提交执行
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/api/login_api.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/api/login_api.md
index c5e7cd2f7..91c83b455 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/api/login_api.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/api/login_api.md
@@ -63,7 +63,7 @@ wds.linkis.ldap.proxy.baseDN=dc=webank,dc=com # 您的LDAP服务的配置
- data:返回具体的数据。
- message:返回请求的提示信息。如果status非0时,message返回的是错误信息,其中data有可能存在stack字段,返回具体的堆栈信息。
-更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](development/development_specification/api.md)
+更多关于 Linkis Restful 接口的规范,请参考:[Linkis Restful 接口规范](/community/development_specification/api)
### 4.1 登录
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/architecture/computation_governance_services/overview.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/architecture/computation_governance_services/overview.md
index 20f60b382..65ea36c02 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/architecture/computation_governance_services/overview.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/architecture/computation_governance_services/overview.md
@@ -49,7 +49,7 @@ Linkis1.0将优化Job的整体执行流程,从提交 —\> 准备 —\>
<!--
#todo Orchestrator文档还没准备好!!
- [进入Orchestrator架构设计](orchestrator/overview.md)
+ [进入Orchestrator架构设计]()
-->
### 3、LinkisManager
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/architecture/job_submission_preparation_and_execution_process.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/architecture/job_submission_preparation_and_execution_process.md
index 8b4f426d9..d0a8dcfc8 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/architecture/job_submission_preparation_and_execution_process.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/architecture/job_submission_preparation_and_execution_process.md
@@ -118,7 +118,7 @@ Linkis Orchestrator的编排流程与很多SQL解析引擎(如Spark、Hive的S
<!--
#todo Orchestrator文档还没准备好!!
-关于Orchestrator的编排详细介绍,请参考:[Orchestrator架构设计](architecture/orchestrator/orchestrator_architecture_doc.md)
+关于Orchestrator的编排详细介绍,请参考:[Orchestrator架构设计]()
-->
经过了Linkis Orchestrator的解析编排后,用户的计算任务已经转换成了一颗可被执行的Physical树。Orchestrator会将该Physical树提交给Orchestrator的Execution模块,进入最后的执行阶段。
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/deployment/quick_deploy.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/deployment/quick_deploy.md
index 0bfc77c99..364398722 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/deployment/quick_deploy.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/deployment/quick_deploy.md
@@ -16,7 +16,7 @@ sidebar_position: 1
**如果您已经是 Linkis 的使用用户,安装或升级前建议先阅读:[Linkis1.0 与 Linkis0.X 的区别简述](architecture/difference_between_1.0_and_0.x.md)**。
-请注意:除了 Linkis1.0 安装包默认已经包含的:Python/Shell/Hive/Spark四个EngineConnPlugin以外,如果大家有需要,可以手动安装如 JDBC 引擎等类型的其他引擎,具体请参考 [EngineConnPlugin引擎插件安装文档](deployment/engine_conn_plugin_installation.md)。
+请注意:除了 Linkis1.0 安装包默认已经包含的:Python/Shell/Hive/Spark四个EngineConnPlugin以外,如果大家有需要,可以手动安装如 JDBC 引擎等类型的其他引擎,具体请参考 [EngineConnPlugin引擎插件安装文档](/docs/latest/deployment/engine_conn_plugin_installation)。
Linkis1.0.3 默认已适配的引擎列表如下:
@@ -445,7 +445,7 @@ select * from linkis_cg_engine_conn_plugin_bml_resources
查看引擎的物料记录是否存在(如果有更新,查看更新时间是否正确)。
-如果不存在或则未更新,先尝试手动刷新物料资源(详细见[引擎物料资源刷新](docs/latest/deployment/engine_conn_plugin_installation#23-引擎刷新))。通过`log/linkis-cg-engineplugin.log`日志,查看物料失败的具体原因,很多时候可能是hdfs目录没有权限导致,检查gateway地址配置是否正确`conf/linkis.properties:wds.linkis.gateway.url`
+如果不存在或则未更新,先尝试手动刷新物料资源(详细见[引擎物料资源刷新](/docs/latest/deployment/engine_conn_plugin_installation#23-引擎刷新))。通过`log/linkis-cg-engineplugin.log`日志,查看物料失败的具体原因,很多时候可能是hdfs目录没有权限导致,检查gateway地址配置是否正确`conf/linkis.properties:wds.linkis.gateway.url`
引擎的物料资源默认上传到hdfs目录为 `/apps-data/${deployUser}/bml`
```shell script
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/development/linkis_compile_and_package.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/development/linkis_compile_and_package.md
index 3930c8e2e..2e8e50026 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/development/linkis_compile_and_package.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/development/linkis_compile_and_package.md
@@ -98,7 +98,7 @@ __编译环境要求:__ 必须 **JDK8** 以上,**Oracle/Sun** 和 **OpenJDK
incubator-linkis-x.x.x/linkis-engineconn-plugins/engineconn-plugins/spark/target/out/spark
```
-如何单独安装 Spark 引擎?请参考 [Linkis 引擎插件安装文档](deployment/engine_conn_plugin_installation.md)
+如何单独安装 Spark 引擎?请参考 [Linkis 引擎插件安装文档](../deployment/engine_conn_plugin_installation)
## 5. 如何修改Linkis的依赖的Hadoop、Hive、Spark版本
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/flink.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/flink.md
index 2528dc8fb..12979f32c 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/flink.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/flink.md
@@ -63,7 +63,7 @@ https://linkis.apache.org/zh-CN/docs/1.1.1/deployment/engine_conn_plugin_install
Linkis1.X是通过标签来进行的,所以需要在我们数据库中插入数据,插入的方式如下文所示。
-[EngineConnPlugin引擎插件安装](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装](../deployment/engine_conn_plugin_installation)
## 3.Flink引擎的使用
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/hive.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/hive.md
index 9b59a9f8a..47116ad53 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/hive.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/hive.md
@@ -35,13 +35,13 @@ on Tez,需要您按照此pr进行一下修改。
如果您已经编译完了您的hive引擎的插件已经编译完成,那么您需要将新的插件放置到指定的位置中才能加载,具体可以参考下面这篇文章
-[EngineConnPlugin引擎插件安装](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装](../deployment/engine_conn_plugin_installation)
### 2.3 hive引擎的标签
Linkis1.X是通过标签来进行的,所以需要在我们数据库中插入数据,插入的方式如下文所示。
-[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](../deployment/engine_conn_plugin_installation)
## 3.hive引擎的使用
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/jdbc.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/jdbc.md
index f476e060c..fcd68a8f8 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/jdbc.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/jdbc.md
@@ -44,7 +44,7 @@ sh linkis-daemon.sh restart cg-engineplugin
Linkis1.X是通过标签来进行的,所以需要在我们数据库中插入数据,插入的方式如下文所示。
-[EngineConnPlugin引擎插件安装](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装](../deployment/engine_conn_plugin_installation)
## 3.JDBC引擎的使用
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/openlookeng.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/openlookeng.md
index 744d2216e..2f43ccfd7 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/openlookeng.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/openlookeng.md
@@ -47,7 +47,7 @@ sh linkis-daemon.sh restart cg-engineplugin
Linkis1.X是通过标签来进行的,所以需要在我们数据库中插入数据,插入的方式如下文所示。
-[EngineConnPlugin引擎插件安装](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装](../deployment/engine_conn_plugin_installation)
## 3 引擎的使用
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/spark.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/spark.md
index 5180f33fe..6cad8cfc7 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/spark.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/engine_usage/spark.md
@@ -33,13 +33,13 @@ sidebar_position: 1
如果您已经编译完了您的spark引擎的插件,那么您需要将新的插件放置到指定的位置中才能加载,具体可以参考下面这篇文章
-[EngineConnPlugin引擎插件安装](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装](../deployment/engine_conn_plugin_installation)
### 2.3 spark引擎的标签
Linkis1.X是通过标签配置来区分引擎版本的,所以需要我们在数据库中插入数据,插入的方式如下文所示。
-[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin引擎插件安装 > 2.2 管理台Configuration配置修改(可选)](../deployment/engine_conn_plugin_installation)
## 3.spark引擎的使用
diff --git a/src/pages/team/config.json b/src/pages/team/config.json
index a408ac83f..87d2e30e5 100644
--- a/src/pages/team/config.json
+++ b/src/pages/team/config.json
@@ -1,7 +1,7 @@
{
"zh-CN": {
"info": {
-      "desc": "您可以通过上报bug/提交新功能或改进建议/提交补丁/文档编写/社区答疑/组织社区活动等方式参与到Apache Linkis的贡献中,详细指引参见<a class=\"link\" href=\"community/how-to-contribute\" rel=\"noopener noreferrer\">贡献者指南</a>。",
+      "desc": "您可以通过上报bug/提交新功能或改进建议/提交补丁/文档编写/社区答疑/组织社区活动等方式参与到Apache Linkis的贡献中,详细指引参见<a class=\"link\" href=\"/community/how-to-contribute\" rel=\"noopener noreferrer\">贡献者指南</a>。",
"tip": "(排名不分先后)"
},
"list": [
@@ -169,7 +169,7 @@
},
"en": {
"info": {
-      "desc": "You can participate in the contribution of Apache Linkis by reporting bugs/submitting new features or improvement suggestions/submitting patches/ writing or refining documents/attending community Q&A/organizing community activities, etc. For detailed instructions, please refer to <a class=\"link\" href=\"community/how-to-contribute\" rel=\"noopener noreferrer\">Contributor's Guide</a>.",
+      "desc": "You can participate in the contribution of Apache Linkis by reporting bugs/submitting new features or improvement suggestions/submitting patches/ writing or refining documents/attending community Q&A/organizing community activities, etc. For detailed instructions, please refer to <a class=\"link\" href=\"/community/how-to-contribute\" rel=\"noopener noreferrer\">Contributor's Guide</a>.",
"tip": "(In no particular order)"
},
"list": [
diff --git a/versioned_docs/version-1.0.2/api/linkis_task_operator.md b/versioned_docs/version-1.0.2/api/linkis_task_operator.md
index b0027c13a..62c7ad681 100644
--- a/versioned_docs/version-1.0.2/api/linkis_task_operator.md
+++ b/versioned_docs/version-1.0.2/api/linkis_task_operator.md
@@ -23,7 +23,7 @@ sidebar_position: 2
- data: return specific data.
- message: return the requested prompt message. If the status is not 0, the message returned is an error message, and the data may have a stack field, which returns specific stack information.
-For more information about the Linkis Restful interface specification, please
refer to: [Linkis Restful Interface
Specification](development/development_specification/api.md)
+For more information about the Linkis Restful interface specification, please
refer to: [Linkis Restful Interface
Specification](../../../community/development_specification/api)
### 1. Submit for Execution
diff --git a/versioned_docs/version-1.0.2/api/login_api.md b/versioned_docs/version-1.0.2/api/login_api.md
index 66fee0faf..522fb1105 100644
--- a/versioned_docs/version-1.0.2/api/login_api.md
+++ b/versioned_docs/version-1.0.2/api/login_api.md
@@ -60,7 +60,7 @@ We provide the following login-related interfaces:
- data: return specific data.
- message: return the requested prompt message. If the status is not 0, the message returns an error message, and the data may have a stack field, which returns specific stack information.
-For more information about the Linkis Restful interface specification, please refer to: [Linkis Restful Interface Specification](development/development_specification/api.md)
+For more information about the Linkis Restful interface specification, please refer to: [Linkis Restful Interface Specification](/community/development_specification/api)
### 1). Login In
diff --git a/versioned_docs/version-1.0.2/architecture/computation_governance_services/overview.md b/versioned_docs/version-1.0.2/architecture/computation_governance_services/overview.md
index a5f19e88a..b40026117 100644
--- a/versioned_docs/version-1.0.2/architecture/computation_governance_services/overview.md
+++ b/versioned_docs/version-1.0.2/architecture/computation_governance_services/overview.md
@@ -33,7 +33,7 @@ Perform three stages to fully upgrade Linkis's Job execution architecture, as sh
<!--
#todo Orchestrator documentation is not ready yet
-[Enter Orchestrator Architecture Design](orchestrator/overview.md)
+[Enter Orchestrator Architecture Design]()
-->
### 3. LinkisManager
diff --git a/versioned_docs/version-1.0.2/architecture/job_submission_preparation_and_execution_process.md b/versioned_docs/version-1.0.2/architecture/job_submission_preparation_and_execution_process.md
index 6b91f0b46..08e32acac 100644
--- a/versioned_docs/version-1.0.2/architecture/job_submission_preparation_and_execution_process.md
+++ b/versioned_docs/version-1.0.2/architecture/job_submission_preparation_and_execution_process.md
@@ -102,7 +102,7 @@ The orchestration process of Linkis Orchestrator is similar to many SQL parsing
<!--
#todo Orchestrator documentation is not ready yet
-Please refer to [Orchestrator Architecture Design](architecture/orchestrator/orchestrator_architecture_doc.md) for more details.
+Please refer to [Orchestrator Architecture Design]() for more details.
-->
After the analysis and arrangement of Linkis Orchestrator, the computing task has been transformed into a executable physical tree. Orchestrator will submit the Physical tree to Orchestrator's Execution module and enter the final execution stage.
diff --git a/versioned_docs/version-1.0.2/development/linkis_compile_and_package.md b/versioned_docs/version-1.0.2/development/linkis_compile_and_package.md
index eeadf3dd9..5b2d245a7 100644
--- a/versioned_docs/version-1.0.2/development/linkis_compile_and_package.md
+++ b/versioned_docs/version-1.0.2/development/linkis_compile_and_package.md
@@ -92,7 +92,7 @@ Get the installation package, there will be a compiled package in the ->target d
incubator-linkis-x.x.x/linkis-engineconn-plugins/engineconn-plugins/spark/target/linkis-engineplugin-spark-x.x.x.jar
```
-How to install Spark engine separately? Please refer to [Linkis Engine Plugin Installation Document](deployment/engine_conn_plugin_installation.md)
+How to install Spark engine separately? Please refer to [Linkis Engine Plugin Installation Document](../deployment/engine_conn_plugin_installation)
## 5. How to modify the Hadoop, Hive, and Spark versions that Linkis depends on
diff --git a/versioned_docs/version-1.0.2/engine_usage/hive.md b/versioned_docs/version-1.0.2/engine_usage/hive.md
index f63dc1920..f19ffb207 100644
--- a/versioned_docs/version-1.0.2/engine_usage/hive.md
+++ b/versioned_docs/version-1.0.2/engine_usage/hive.md
@@ -33,13 +33,13 @@ Other hive operating modes are similar, just copy the corresponding dependencies
If you have already compiled your hive engineConn plug-in has been compiled, then you need to put the new plug-in in the specified location to load, you can refer to the following article for details
-[EngineConnPlugin Installation](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation](../deployment/engine_conn_plugin_installation)
### 2.3 Linkis adds Hive console parameters(optional)
Linkis can configure the corresponding EngineConn parameters on the management console. If your newly added EngineConn needs this feature, you can refer to the following documents:
-[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](../deployment/engine_conn_plugin_installation)
## 3. Use of hive engineConn
diff --git a/versioned_docs/version-1.0.2/engine_usage/spark.md b/versioned_docs/version-1.0.2/engine_usage/spark.md
index e5ec91168..106ebce8f 100644
--- a/versioned_docs/version-1.0.2/engine_usage/spark.md
+++ b/versioned_docs/version-1.0.2/engine_usage/spark.md
@@ -34,13 +34,13 @@ In theory, Linkis1.0 supports all versions of spark2.x and above. Spark 2.4.3 is
If you have already compiled your spark EngineConn plug-in has been compiled, then you need to put the new plug-in to the specified location to load, you can refer to the following article for details
-[EngineConnPlugin Installation](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation](../deployment/engine_conn_plugin_installation)
### 2.3 tags of spark EngineConn
Linkis1.0 is done through tags, so we need to insert data in our database, the way of inserting is shown below.
-[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](../deployment/engine_conn_plugin_installation)
## 3. Use of spark EngineConn
diff --git a/versioned_docs/version-1.0.2/introduction.md b/versioned_docs/version-1.0.2/introduction.md
index e4c5440f0..cea8bbab1 100644
--- a/versioned_docs/version-1.0.2/introduction.md
+++ b/versioned_docs/version-1.0.2/introduction.md
@@ -58,7 +58,7 @@ Please follow [Compile Guide](development/linkis_compile_and_package.md) to comp
Please refer to [Deployment_Documents](deployment/quick_deploy.md) to do the deployment.
# Examples and Guidance
-You can find examples and guidance for how to use and manage Linkis in [User_Manual](user_guide/overview.md), [Engine_Usage_Documents](engine_usage/overview.md) and [API_Documents](dapi/overview.md).
+You can find examples and guidance for how to use and manage Linkis in [User_Manual](user_guide/overview.md), [Engine_Usage_Documents](engine_usage/overview.md) and [API_Documents](../version-1.0.2/api/overview.md).
# Documentation
diff --git a/versioned_docs/version-1.0.3/api/linkis_task_operator.md b/versioned_docs/version-1.0.3/api/linkis_task_operator.md
index b0027c13a..203fa570b 100644
--- a/versioned_docs/version-1.0.3/api/linkis_task_operator.md
+++ b/versioned_docs/version-1.0.3/api/linkis_task_operator.md
@@ -23,7 +23,7 @@ sidebar_position: 2
- data: return specific data.
- message: return the requested prompt message. If the status is not 0, the message returned is an error message, and the data may have a stack field, which returns specific stack information.
-For more information about the Linkis Restful interface specification, please refer to: [Linkis Restful Interface Specification](development/development_specification/api.md)
+For more information about the Linkis Restful interface specification, please refer to: [Linkis Restful Interface Specification](/community/development_specification/api)
### 1. Submit for Execution
diff --git a/versioned_docs/version-1.0.3/api/login_api.md b/versioned_docs/version-1.0.3/api/login_api.md
index 66fee0faf..522fb1105 100644
--- a/versioned_docs/version-1.0.3/api/login_api.md
+++ b/versioned_docs/version-1.0.3/api/login_api.md
@@ -60,7 +60,7 @@ We provide the following login-related interfaces:
- data: return specific data.
- message: return the requested prompt message. If the status is not 0, the message returns an error message, and the data may have a stack field, which returns specific stack information.
-For more information about the Linkis Restful interface specification, please refer to: [Linkis Restful Interface Specification](development/development_specification/api.md)
+For more information about the Linkis Restful interface specification, please refer to: [Linkis Restful Interface Specification](/community/development_specification/api)
### 1). Login In
diff --git a/versioned_docs/version-1.0.3/architecture/computation_governance_services/overview.md b/versioned_docs/version-1.0.3/architecture/computation_governance_services/overview.md
index 062ccc91c..80c421ad7 100644
--- a/versioned_docs/version-1.0.3/architecture/computation_governance_services/overview.md
+++ b/versioned_docs/version-1.0.3/architecture/computation_governance_services/overview.md
@@ -33,7 +33,7 @@ Perform three stages to fully upgrade Linkis's Job execution architecture, as sh
<!--
#todo Orchestrator documentation is not ready yet
-[Enter Orchestrator Architecture Design](orchestrator/overview.md)
+[Enter Orchestrator Architecture Design]()
-->
### 3. LinkisManager
diff --git a/versioned_docs/version-1.0.3/architecture/job_submission_preparation_and_execution_process.md b/versioned_docs/version-1.0.3/architecture/job_submission_preparation_and_execution_process.md
index cd7f483fa..dd6e8436c 100644
--- a/versioned_docs/version-1.0.3/architecture/job_submission_preparation_and_execution_process.md
+++ b/versioned_docs/version-1.0.3/architecture/job_submission_preparation_and_execution_process.md
@@ -102,7 +102,7 @@ The orchestration process of Linkis Orchestrator is similar to many SQL parsing
<!--
#todo Orchestrator documentation is not ready yet
-Please refer to [Orchestrator Architecture Design](architecture/orchestrator/orchestrator_architecture_doc.md) for more details.
+Please refer to [Orchestrator Architecture Design]() for more details.
-->
After the analysis and arrangement of Linkis Orchestrator, the computing task has been transformed into a executable physical tree. Orchestrator will submit the Physical tree to Orchestrator's Execution module and enter the final execution stage.
diff --git a/versioned_docs/version-1.0.3/development/linkis_compile_and_package.md b/versioned_docs/version-1.0.3/development/linkis_compile_and_package.md
index 6fd8678fd..f90db5fd2 100644
--- a/versioned_docs/version-1.0.3/development/linkis_compile_and_package.md
+++ b/versioned_docs/version-1.0.3/development/linkis_compile_and_package.md
@@ -108,7 +108,7 @@ Get the installation package, there will be a compiled package in the ->target d
incubator-linkis-x.x.x/linkis-engineconn-plugins/engineconn-plugins/spark/target/linkis-engineplugin-spark-x.x.x.jar
```
-How to install Spark engine separately? Please refer to [Linkis Engine Plugin Installation Document](deployment/engine_conn_plugin_installation.md)
+How to install Spark engine separately? Please refer to [Linkis Engine Plugin Installation Document](../deployment/engine_conn_plugin_installation)
## 5. How to modify the Hadoop, Hive, and Spark versions that Linkis depends on
diff --git a/versioned_docs/version-1.0.3/engine_usage/flink.md b/versioned_docs/version-1.0.3/engine_usage/flink.md
index ec566a99f..2e6124914 100644
--- a/versioned_docs/version-1.0.3/engine_usage/flink.md
+++ b/versioned_docs/version-1.0.3/engine_usage/flink.md
@@ -56,13 +56,13 @@ cd ${LINKIS_HOME}/sbin
sh linkis-daemon restart cg-engineplugin
```
A more detailed introduction to engineplugin can be found in the following article.
-[EngineConnPlugin Installation](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation](../deployment/engine_conn_plugin_installation)
### 2.3 Flink engine tags
Linkis1.0 is done through tags, so we need to insert data in our database, the way of inserting is shown below.
-[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](../deployment/engine_conn_plugin_installation)
## 3. The use of Flink engine
diff --git a/versioned_docs/version-1.0.3/engine_usage/hive.md b/versioned_docs/version-1.0.3/engine_usage/hive.md
index c01297337..bc52ab484 100644
--- a/versioned_docs/version-1.0.3/engine_usage/hive.md
+++ b/versioned_docs/version-1.0.3/engine_usage/hive.md
@@ -32,13 +32,13 @@ Other hive operating modes are similar, just copy the corresponding dependencies
If you have already compiled your hive engineConn plug-in has been compiled, then you need to put the new plug-in in the specified location to load, you can refer to the following article for details
-[EngineConnPlugin Installation](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation](../deployment/engine_conn_plugin_installation)
### 2.3 Linkis adds Hive console parameters(optional)
Linkis can configure the corresponding EngineConn parameters on the management console. If your newly added EngineConn needs this feature, you can refer to the following documents:
-[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](../deployment/engine_conn_plugin_installation)
## 3. Use of hive engineConn
diff --git a/versioned_docs/version-1.0.3/engine_usage/spark.md b/versioned_docs/version-1.0.3/engine_usage/spark.md
index e5ec91168..106ebce8f 100644
--- a/versioned_docs/version-1.0.3/engine_usage/spark.md
+++ b/versioned_docs/version-1.0.3/engine_usage/spark.md
@@ -34,13 +34,13 @@ In theory, Linkis1.0 supports all versions of spark2.x and above. Spark 2.4.3 is
If you have already compiled your spark EngineConn plug-in has been compiled, then you need to put the new plug-in to the specified location to load, you can refer to the following article for details
-[EngineConnPlugin Installation](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation](../deployment/engine_conn_plugin_installation)
### 2.3 tags of spark EngineConn
Linkis1.0 is done through tags, so we need to insert data in our database, the way of inserting is shown below.
-[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](../deployment/engine_conn_plugin_installation)
## 3. Use of spark EngineConn
diff --git a/versioned_docs/version-1.0.3/introduction.md b/versioned_docs/version-1.0.3/introduction.md
index 21d89c131..c2e9da9ab 100644
--- a/versioned_docs/version-1.0.3/introduction.md
+++ b/versioned_docs/version-1.0.3/introduction.md
@@ -58,7 +58,7 @@ Please follow [Compile Guide](development/linkis_compile_and_package.md) to comp
Please refer to [Deployment_Documents](deployment/quick_deploy.md) to do the deployment.
## Examples and Guidance
-You can find examples and guidance for how to use and manage Linkis in [User_Manual](user_guide/overview.md), [Engine_Usage_Documents](engine_usage/overview.md) and [API_Documents](dapi/overview.md).
+You can find examples and guidance for how to use and manage Linkis in [User_Manual](user_guide/overview.md), [Engine_Usage_Documents](engine_usage/overview.md) and [API_Documents](../../docs/api/overview.md).
## Documentation
diff --git a/versioned_docs/version-1.1.0/api/linkis_task_operator.md b/versioned_docs/version-1.1.0/api/linkis_task_operator.md
index b0027c13a..203fa570b 100644
--- a/versioned_docs/version-1.1.0/api/linkis_task_operator.md
+++ b/versioned_docs/version-1.1.0/api/linkis_task_operator.md
@@ -23,7 +23,7 @@ sidebar_position: 2
- data: return specific data.
- message: return the requested prompt message. If the status is not 0, the message returned is an error message, and the data may have a stack field, which returns specific stack information.
-For more information about the Linkis Restful interface specification, please refer to: [Linkis Restful Interface Specification](development/development_specification/api.md)
+For more information about the Linkis Restful interface specification, please refer to: [Linkis Restful Interface Specification](/community/development_specification/api)
### 1. Submit for Execution
diff --git a/versioned_docs/version-1.1.0/api/login_api.md b/versioned_docs/version-1.1.0/api/login_api.md
index 66fee0faf..522fb1105 100644
--- a/versioned_docs/version-1.1.0/api/login_api.md
+++ b/versioned_docs/version-1.1.0/api/login_api.md
@@ -60,7 +60,7 @@ We provide the following login-related interfaces:
- data: return specific data.
- message: return the requested prompt message. If the status is not 0, the message returns an error message, and the data may have a stack field, which returns specific stack information.
-For more information about the Linkis Restful interface specification, please refer to: [Linkis Restful Interface Specification](development/development_specification/api.md)
+For more information about the Linkis Restful interface specification, please refer to: [Linkis Restful Interface Specification](/community/development_specification/api)
### 1). Login In
diff --git a/versioned_docs/version-1.1.0/architecture/computation_governance_services/overview.md b/versioned_docs/version-1.1.0/architecture/computation_governance_services/overview.md
index 062ccc91c..80c421ad7 100644
--- a/versioned_docs/version-1.1.0/architecture/computation_governance_services/overview.md
+++ b/versioned_docs/version-1.1.0/architecture/computation_governance_services/overview.md
@@ -33,7 +33,7 @@ Perform three stages to fully upgrade Linkis's Job execution architecture, as sh
<!--
#todo Orchestrator documentation is not ready yet
-[Enter Orchestrator Architecture Design](orchestrator/overview.md)
+[Enter Orchestrator Architecture Design]()
-->
### 3. LinkisManager
diff --git a/versioned_docs/version-1.1.0/architecture/job_submission_preparation_and_execution_process.md b/versioned_docs/version-1.1.0/architecture/job_submission_preparation_and_execution_process.md
index cd7f483fa..dd6e8436c 100644
--- a/versioned_docs/version-1.1.0/architecture/job_submission_preparation_and_execution_process.md
+++ b/versioned_docs/version-1.1.0/architecture/job_submission_preparation_and_execution_process.md
@@ -102,7 +102,7 @@ The orchestration process of Linkis Orchestrator is similar to many SQL parsing
<!--
#todo Orchestrator documentation is not ready yet
-Please refer to [Orchestrator Architecture Design](architecture/orchestrator/orchestrator_architecture_doc.md) for more details.
+Please refer to [Orchestrator Architecture Design]() for more details.
-->
After the analysis and arrangement of Linkis Orchestrator, the computing task has been transformed into a executable physical tree. Orchestrator will submit the Physical tree to Orchestrator's Execution module and enter the final execution stage.
diff --git a/versioned_docs/version-1.1.0/development/linkis_compile_and_package.md b/versioned_docs/version-1.1.0/development/linkis_compile_and_package.md
index 6fd8678fd..f90db5fd2 100644
--- a/versioned_docs/version-1.1.0/development/linkis_compile_and_package.md
+++ b/versioned_docs/version-1.1.0/development/linkis_compile_and_package.md
@@ -108,7 +108,7 @@ Get the installation package, there will be a compiled package in the ->target d
incubator-linkis-x.x.x/linkis-engineconn-plugins/engineconn-plugins/spark/target/linkis-engineplugin-spark-x.x.x.jar
```
-How to install Spark engine separately? Please refer to [Linkis Engine Plugin Installation Document](deployment/engine_conn_plugin_installation.md)
+How to install Spark engine separately? Please refer to [Linkis Engine Plugin Installation Document](../deployment/engine_conn_plugin_installation)
## 5. How to modify the Hadoop, Hive, and Spark versions that Linkis depends on
diff --git a/versioned_docs/version-1.1.0/engine_usage/flink.md b/versioned_docs/version-1.1.0/engine_usage/flink.md
index ec566a99f..2e6124914 100644
--- a/versioned_docs/version-1.1.0/engine_usage/flink.md
+++ b/versioned_docs/version-1.1.0/engine_usage/flink.md
@@ -56,13 +56,13 @@ cd ${LINKIS_HOME}/sbin
sh linkis-daemon restart cg-engineplugin
```
A more detailed introduction to engineplugin can be found in the following article.
-[EngineConnPlugin Installation](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation](../deployment/engine_conn_plugin_installation)
### 2.3 Flink engine tags
Linkis1.0 is done through tags, so we need to insert data in our database, the way of inserting is shown below.
-[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](../deployment/engine_conn_plugin_installation)
## 3. The use of Flink engine
diff --git a/versioned_docs/version-1.1.0/engine_usage/hive.md b/versioned_docs/version-1.1.0/engine_usage/hive.md
index c01297337..bc52ab484 100644
--- a/versioned_docs/version-1.1.0/engine_usage/hive.md
+++ b/versioned_docs/version-1.1.0/engine_usage/hive.md
@@ -32,13 +32,13 @@ Other hive operating modes are similar, just copy the corresponding dependencies
If you have already compiled your hive engineConn plug-in has been compiled, then you need to put the new plug-in in the specified location to load, you can refer to the following article for details
-[EngineConnPlugin Installation](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation](../deployment/engine_conn_plugin_installation)
### 2.3 Linkis adds Hive console parameters(optional)
Linkis can configure the corresponding EngineConn parameters on the management console. If your newly added EngineConn needs this feature, you can refer to the following documents:
-[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](../deployment/engine_conn_plugin_installation)
## 3. Use of hive engineConn
diff --git a/versioned_docs/version-1.1.0/engine_usage/spark.md b/versioned_docs/version-1.1.0/engine_usage/spark.md
index e5ec91168..106ebce8f 100644
--- a/versioned_docs/version-1.1.0/engine_usage/spark.md
+++ b/versioned_docs/version-1.1.0/engine_usage/spark.md
@@ -34,13 +34,13 @@ In theory, Linkis1.0 supports all versions of spark2.x and above. Spark 2.4.3 is
If you have already compiled your spark EngineConn plug-in has been compiled, then you need to put the new plug-in to the specified location to load, you can refer to the following article for details
-[EngineConnPlugin Installation](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation](../deployment/engine_conn_plugin_installation)
### 2.3 tags of spark EngineConn
Linkis1.0 is done through tags, so we need to insert data in our database, the way of inserting is shown below.
-[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](../deployment/engine_conn_plugin_installation)
## 3. Use of spark EngineConn
diff --git a/versioned_docs/version-1.1.0/introduction.md b/versioned_docs/version-1.1.0/introduction.md
index 39f726571..637f06dab 100644
--- a/versioned_docs/version-1.1.0/introduction.md
+++ b/versioned_docs/version-1.1.0/introduction.md
@@ -55,7 +55,7 @@ Please follow [Compile Guide](development/linkis_compile_and_package.md) to comp
Please refer to [Deployment_Documents](deployment/quick_deploy.md) to do the deployment.
## Examples and Guidance
-You can find examples and guidance for how to use and manage Linkis in [User_Manual](user_guide/overview.md), [Engine_Usage_Documents](engine_usage/overview.md) and [API_Documents](dapi/overview.md).
+You can find examples and guidance for how to use and manage Linkis in [User_Manual](user_guide/overview.md), [Engine_Usage_Documents](engine_usage/overview.md) and [API_Documents](api/overview.md).
## Documentation
diff --git a/versioned_docs/version-1.1.1/api/linkis_task_operator.md b/versioned_docs/version-1.1.1/api/linkis_task_operator.md
index b0027c13a..203fa570b 100644
--- a/versioned_docs/version-1.1.1/api/linkis_task_operator.md
+++ b/versioned_docs/version-1.1.1/api/linkis_task_operator.md
@@ -23,7 +23,7 @@ sidebar_position: 2
- data: return specific data.
- message: return the requested prompt message. If the status is not 0, the message returned is an error message, and the data may have a stack field, which returns specific stack information.
-For more information about the Linkis Restful interface specification, please refer to: [Linkis Restful Interface Specification](development/development_specification/api.md)
+For more information about the Linkis Restful interface specification, please refer to: [Linkis Restful Interface Specification](/community/development_specification/api)
### 1. Submit for Execution
diff --git a/versioned_docs/version-1.1.1/api/login_api.md b/versioned_docs/version-1.1.1/api/login_api.md
index 66fee0faf..522fb1105 100644
--- a/versioned_docs/version-1.1.1/api/login_api.md
+++ b/versioned_docs/version-1.1.1/api/login_api.md
@@ -60,7 +60,7 @@ We provide the following login-related interfaces:
- data: return specific data.
- message: return the requested prompt message. If the status is not 0, the message returns an error message, and the data may have a stack field, which returns specific stack information.
-For more information about the Linkis Restful interface specification, please refer to: [Linkis Restful Interface Specification](development/development_specification/api.md)
+For more information about the Linkis Restful interface specification, please refer to: [Linkis Restful Interface Specification](/community/development_specification/api)
### 1). Login In
diff --git a/versioned_docs/version-1.1.1/architecture/computation_governance_services/overview.md b/versioned_docs/version-1.1.1/architecture/computation_governance_services/overview.md
index 062ccc91c..80c421ad7 100644
--- a/versioned_docs/version-1.1.1/architecture/computation_governance_services/overview.md
+++ b/versioned_docs/version-1.1.1/architecture/computation_governance_services/overview.md
@@ -33,7 +33,7 @@ Perform three stages to fully upgrade Linkis's Job execution architecture, as sh
<!--
#todo Orchestrator documentation is not ready yet
-[Enter Orchestrator Architecture Design](orchestrator/overview.md)
+[Enter Orchestrator Architecture Design]()
-->
### 3. LinkisManager
diff --git a/versioned_docs/version-1.1.1/architecture/job_submission_preparation_and_execution_process.md b/versioned_docs/version-1.1.1/architecture/job_submission_preparation_and_execution_process.md
index cd7f483fa..dd6e8436c 100644
--- a/versioned_docs/version-1.1.1/architecture/job_submission_preparation_and_execution_process.md
+++ b/versioned_docs/version-1.1.1/architecture/job_submission_preparation_and_execution_process.md
@@ -102,7 +102,7 @@ The orchestration process of Linkis Orchestrator is similar to many SQL parsing
<!--
#todo Orchestrator documentation is not ready yet
-Please refer to [Orchestrator Architecture Design](architecture/orchestrator/orchestrator_architecture_doc.md) for more details.
+Please refer to [Orchestrator Architecture Design]() for more details.
-->
After the analysis and arrangement of Linkis Orchestrator, the computing task has been transformed into a executable physical tree. Orchestrator will submit the Physical tree to Orchestrator's Execution module and enter the final execution stage.
diff --git a/versioned_docs/version-1.1.1/development/linkis_compile_and_package.md b/versioned_docs/version-1.1.1/development/linkis_compile_and_package.md
index 6fd8678fd..f90db5fd2 100644
--- a/versioned_docs/version-1.1.1/development/linkis_compile_and_package.md
+++ b/versioned_docs/version-1.1.1/development/linkis_compile_and_package.md
@@ -108,7 +108,7 @@ Get the installation package, there will be a compiled package in the ->target d
incubator-linkis-x.x.x/linkis-engineconn-plugins/engineconn-plugins/spark/target/linkis-engineplugin-spark-x.x.x.jar
```
-How to install Spark engine separately? Please refer to [Linkis Engine Plugin Installation Document](deployment/engine_conn_plugin_installation.md)
+How to install Spark engine separately? Please refer to [Linkis Engine Plugin Installation Document](../deployment/engine_conn_plugin_installation)
## 5. How to modify the Hadoop, Hive, and Spark versions that Linkis depends on
diff --git a/versioned_docs/version-1.1.1/engine_usage/flink.md b/versioned_docs/version-1.1.1/engine_usage/flink.md
index ec566a99f..2e6124914 100644
--- a/versioned_docs/version-1.1.1/engine_usage/flink.md
+++ b/versioned_docs/version-1.1.1/engine_usage/flink.md
@@ -56,13 +56,13 @@ cd ${LINKIS_HOME}/sbin
sh linkis-daemon restart cg-engineplugin
```
A more detailed introduction to engineplugin can be found in the following article.
-[EngineConnPlugin Installation](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation](../deployment/engine_conn_plugin_installation)
### 2.3 Flink engine tags
Linkis1.0 is done through tags, so we need to insert data in our database, the way of inserting is shown below.
-[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](../deployment/engine_conn_plugin_installation)
## 3. The use of Flink engine
diff --git a/versioned_docs/version-1.1.1/engine_usage/hive.md b/versioned_docs/version-1.1.1/engine_usage/hive.md
index c01297337..bc52ab484 100644
--- a/versioned_docs/version-1.1.1/engine_usage/hive.md
+++ b/versioned_docs/version-1.1.1/engine_usage/hive.md
@@ -32,13 +32,13 @@ Other hive operating modes are similar, just copy the corresponding dependencies
If you have already compiled your hive engineConn plug-in has been compiled, then you need to put the new plug-in in the specified location to load, you can refer to the following article for details
-[EngineConnPlugin Installation](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation](../deployment/engine_conn_plugin_installation)
### 2.3 Linkis adds Hive console parameters(optional)
Linkis can configure the corresponding EngineConn parameters on the management console. If your newly added EngineConn needs this feature, you can refer to the following documents:
-[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](../deployment/engine_conn_plugin_installation)
## 3. Use of hive engineConn
diff --git a/versioned_docs/version-1.1.1/engine_usage/openlookeng.md b/versioned_docs/version-1.1.1/engine_usage/openlookeng.md
index aff47b340..80586220b 100644
--- a/versioned_docs/version-1.1.1/engine_usage/openlookeng.md
+++ b/versioned_docs/version-1.1.1/engine_usage/openlookeng.md
@@ -47,7 +47,7 @@ sh linkis-daemon restart cg-engineplugin
Linkis1.X is done through tags, so we need to insert data into our database, and the insertion method is as follows.
-[EngineConnPlugin engine plugin installation](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin engine plugin installation](../deployment/engine_conn_plugin_installation)
## 3 The use of the engine
diff --git a/versioned_docs/version-1.1.1/engine_usage/spark.md b/versioned_docs/version-1.1.1/engine_usage/spark.md
index e5ec91168..106ebce8f 100644
--- a/versioned_docs/version-1.1.1/engine_usage/spark.md
+++ b/versioned_docs/version-1.1.1/engine_usage/spark.md
@@ -34,13 +34,13 @@ In theory, Linkis1.0 supports all versions of spark2.x and above. Spark 2.4.3 is
If you have already compiled your spark EngineConn plug-in has been compiled, then you need to put the new plug-in to the specified location to load, you can refer to the following article for details
-[EngineConnPlugin Installation](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation](../deployment/engine_conn_plugin_installation)
### 2.3 tags of spark EngineConn
Linkis1.0 is done through tags, so we need to insert data in our database, the way of inserting is shown below.
-[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](deployment/engine_conn_plugin_installation.md)
+[EngineConnPlugin Installation > 2.2 Configuration modification of management console (optional)](../deployment/engine_conn_plugin_installation)
## 3. Use of spark EngineConn
diff --git a/versioned_docs/version-1.1.1/introduction.md b/versioned_docs/version-1.1.1/introduction.md
index 39f726571..637f06dab 100644
--- a/versioned_docs/version-1.1.1/introduction.md
+++ b/versioned_docs/version-1.1.1/introduction.md
@@ -55,7 +55,7 @@ Please follow [Compile Guide](development/linkis_compile_and_package.md) to comp
Please refer to [Deployment_Documents](deployment/quick_deploy.md) to do the deployment.
## Examples and Guidance
-You can find examples and guidance for how to use and manage Linkis in [User_Manual](user_guide/overview.md), [Engine_Usage_Documents](engine_usage/overview.md) and [API_Documents](dapi/overview.md).
+You can find examples and guidance for how to use and manage Linkis in [User_Manual](user_guide/overview.md), [Engine_Usage_Documents](engine_usage/overview.md) and [API_Documents](api/overview.md).
## Documentation
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]