This is an automated email from the ASF dual-hosted git repository.

casion pushed a commit to branch dev
in repository https://gitbox.apache.org/repos/asf/linkis-website.git


The following commit(s) were added to refs/heads/dev by this push:
     new f860c02317 add introduction for hadoop and spark multi version (#674)
f860c02317 is described below

commit f860c0231768c0a8b091a6847d39e013c122827e
Author: GuoPhilipse <[email protected]>
AuthorDate: Fri Feb 17 14:47:06 2023 +0800

    add introduction for hadoop and spark multi version (#674)
    
    * update hadoop and spark docs
---
 docs/deployment/version-adaptation.md              | 63 +++++++++++++++++++++-
 .../current/deployment/version-adaptation.md       | 60 +++++++++++++++++++++
 2 files changed, 122 insertions(+), 1 deletion(-)

diff --git a/docs/deployment/version-adaptation.md b/docs/deployment/version-adaptation.md
index dfff66bbd2..0ab2dcf6e9 100644
--- a/docs/deployment/version-adaptation.md
+++ b/docs/deployment/version-adaptation.md
@@ -63,6 +63,7 @@ SET @OPENLOOKENG_LABEL="openlookeng-1.5.0";
 
 #### 5.1.1 The pom file of linkis
 
+For Linkis version < 1.3.2
 ```xml
 <hadoop.version>3.1.1</hadoop.version>
 <scala.version>2.12.10</scala.version>
@@ -74,10 +75,25 @@ SET @OPENLOOKENG_LABEL="openlookeng-1.5.0";
     <artifactId>hadoop-hdfs-client</artifactId>
     <version>${hadoop.version}</version>
 </dependency>
-```
 
+```
+For Linkis version >= 1.3.2, we only need to set `scala.version` and `scala.binary.version` if necessary:
+```xml
+<scala.version>2.12.10</scala.version>
+<scala.binary.version>2.12</scala.binary.version>
+```
+This is because we can compile directly with the `hadoop-3.3` or `hadoop-2.7` profile.
+Profile `hadoop-3.3` can be used for any hadoop3.x; the default hadoop3.x version is hadoop 3.3.1.
+Profile `hadoop-2.7` can be used for any hadoop2.x; the default hadoop2.x version is hadoop 2.7.2.
+Other hadoop versions can be specified with `-Dhadoop.version=xxx`:
+```text
+mvn -N install
+mvn clean install -Phadoop-3.3 -Dmaven.test.skip=true
+mvn clean install -Phadoop-3.3 -Dhadoop.version=3.1.1 -Dmaven.test.skip=true
+```
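+
+As an illustration, a version profile of this shape would produce the behavior described above. This is a minimal sketch of a generic Maven profile, not the actual Linkis pom:
+```xml
+<profile>
+    <!-- activated with: mvn clean install -Phadoop-3.3 -->
+    <id>hadoop-3.3</id>
+    <properties>
+        <!-- default for any hadoop3.x build; a -Dhadoop.version=xxx passed
+             on the command line takes precedence over this property -->
+        <hadoop.version>3.3.1</hadoop.version>
+    </properties>
+</profile>
+```
+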
 #### 5.1.2  The pom file of linkis-hadoop-common
 
+For Linkis version < 1.3.2
 ```xml
 <!-- Notice here <version>${hadoop.version}</version> , adjust according to whether you have encountered errors -->
 <dependency>
@@ -87,6 +103,8 @@ SET @OPENLOOKENG_LABEL="openlookeng-1.5.0";
 </dependency>
 ```
 
+For Linkis version >= 1.3.2, the `linkis-hadoop-common` module does not need to be changed.
+
 #### 5.1.3 The pom file of linkis-engineplugin-hive
 
 ```xml
@@ -95,10 +113,24 @@ SET @OPENLOOKENG_LABEL="openlookeng-1.5.0";
 
 #### 5.1.4 The pom file of linkis-engineplugin-spark
 
+For Linkis version < 1.3.2
 ```xml
 <spark.version>3.0.1</spark.version>
 ```
 
+For Linkis version >= 1.3.2
+
+We can compile directly with the `spark-3.2` or `spark-2.4-hadoop-3.3` profile; if spark needs to be used with hadoop3, the `hadoop-3.3` profile is needed as well.
+The default spark3.x version is spark 3.2.1. If we compile with `spark-3.2`, the scala version is 2.12.15 by default,
+so we do not need to set the scala version in the Linkis project pom file (mentioned in 5.1.1).
+If spark2.x is used with hadoop3, the profile `spark-2.4-hadoop-3.3` needs to be activated for compatibility reasons.
+```text
+mvn -N install
+mvn clean install -Pspark-3.2 -Phadoop-3.3 -Dmaven.test.skip=true
+mvn clean install -Pspark-2.4-hadoop-3.3 -Phadoop-3.3 -Dmaven.test.skip=true
+```
+
 #### 5.1.5 The pom file of flink-engineconn-flink
 
 ```xml
@@ -160,6 +192,7 @@ org.apache.linkis.governance.common.conf.GovernanceCommonConf file adjustment
 
 #### 6.1.1 The pom file of linkis
 
+For Linkis version < 1.3.2
 ```xml
 <hadoop.version>3.1.1</hadoop.version>
 <json4s.version>3.2.11</json4s.version>
@@ -172,6 +205,20 @@ org.apache.linkis.governance.common.conf.GovernanceCommonConf file adjustment
 </dependency>
 ```
 
+For Linkis version >= 1.3.2, we only need to set `json4s.version` if necessary:
+```xml
+<json4s.version>3.2.11</json4s.version>
+```
+This is because we can compile directly with the `hadoop-3.3` or `hadoop-2.7` profile.
+Profile `hadoop-3.3` can be used for any hadoop3.x; the default hadoop3.x version is hadoop 3.3.1.
+Profile `hadoop-2.7` can be used for any hadoop2.x; the default hadoop2.x version is hadoop 2.7.2.
+Other hadoop versions can be specified with `-Dhadoop.version=xxx`:
+```text
+mvn -N install
+mvn clean install -Phadoop-3.3 -Dmaven.test.skip=true
+mvn clean install -Phadoop-3.3 -Dhadoop.version=3.1.1 -Dmaven.test.skip=true
+```
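+
+To check which hadoop version a profile resolves to before running a full build, the standard maven-help-plugin can evaluate the property. This is a generic Maven technique, not something the Linkis docs prescribe, and it assumes maven-help-plugin 3.1+ for `-DforceStdout`:
+```text
+mvn help:evaluate -Dexpression=hadoop.version -Phadoop-3.3 -q -DforceStdout
+mvn help:evaluate -Dexpression=hadoop.version -Phadoop-3.3 -Dhadoop.version=3.1.1 -q -DforceStdout
+```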
+
 #### 6.1.2 The pom file of linkis-engineplugin-hive
 
 ```xml
@@ -180,10 +227,24 @@ org.apache.linkis.governance.common.conf.GovernanceCommonConf file adjustment
 
 #### 6.1.3 The pom file of linkis-engineplugin-spark
 
+For Linkis version < 1.3.2
 ```xml
 <spark.version>2.3.2</spark.version>
 ```
 
+For Linkis version >= 1.3.2
+
+We can compile directly with the `spark-3.2` profile; if spark needs to be used with hadoop3, the `hadoop-3.3` profile is needed as well.
+The default spark3.x version is spark 3.2.1. If we compile with `spark-3.2`, the scala version is 2.12.15 by default,
+so we do not need to set the scala version in the Linkis project pom file (mentioned in 5.1.1).
+If spark2.x is used with hadoop3, the profile `spark-2.4-hadoop-3.3` needs to be activated for compatibility reasons.
+```text
+mvn -N install
+mvn clean install -Pspark-3.2 -Phadoop-3.3 -Dmaven.test.skip=true
+mvn clean install -Pspark-2.4-hadoop-3.3 -Phadoop-3.3 -Dmaven.test.skip=true
+```
+
 #### 6.1.4 linkis-label-common adjustment
 
 org.apache.linkis.manager.label.conf.LabelCommonConfig file adjustment
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/version-adaptation.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/version-adaptation.md
index 3300361866..a496e5c511 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/version-adaptation.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/version-adaptation.md
@@ -63,6 +63,7 @@ SET @OPENLOOKENG_LABEL="openlookeng-1.5.0";
 
 #### 5.1.1 The pom file of linkis
 
+For Linkis version < 1.3.2
 ```java
 <hadoop.version>3.1.1</hadoop.version>
 <scala.version>2.12.10</scala.version>
@@ -76,8 +77,24 @@ SET @OPENLOOKENG_LABEL="openlookeng-1.5.0";
 </dependency>
 ```
 
+For Linkis version >= 1.3.2, we only need to set `scala.version` and `scala.binary.version`:
+```java
+<scala.version>2.12.10</scala.version>
+<scala.binary.version>2.12</scala.binary.version>
+```
+This is because we can compile directly with the `hadoop-3.3` or `hadoop-2.7` profile.
+Profile `hadoop-3.3` can be used for any hadoop3.x; the default hadoop3.x version is 3.3.1.
+Profile `hadoop-2.7` can be used for any hadoop2.x; the default hadoop2.x version is 2.7.2.
+To use another version, specify `-Dhadoop.version=xxx` at compile time:
+```text
+mvn -N install
+mvn clean install -Phadoop-3.3 -Dmaven.test.skip=true
+mvn clean install -Phadoop-3.3 -Dhadoop.version=3.1.1 -Dmaven.test.skip=true
+```
+
 #### 5.1.2 The pom file of linkis-hadoop-common
 
+For Linkis version < 1.3.2
 ```java
 <!-- Note the <version>${hadoop.version}</version> here; adjust it according to whether you encounter errors -->
 <dependency>
@@ -87,6 +104,8 @@ SET @OPENLOOKENG_LABEL="openlookeng-1.5.0";
 </dependency>
 ```
 
+For Linkis version >= 1.3.2, the `linkis-hadoop-common` module does not need to be changed.
+
 #### 5.1.3 The pom file of linkis-engineplugin-hive
 
 ```java
@@ -95,10 +114,23 @@ SET @OPENLOOKENG_LABEL="openlookeng-1.5.0";
 
 #### 5.1.4 The pom file of linkis-engineplugin-spark
 
+For Linkis version < 1.3.2
 ```java
 <spark.version>3.0.1</spark.version>
 ```
 
+For Linkis version >= 1.3.2
+
+We can compile directly with the `spark-3.2` profile; if we use hadoop3 at the same time, the `hadoop-3.3` profile also needs to be specified.
+The default spark3.x version is 3.2.1. If we compile with the `spark-3.2` profile, the scala version defaults to 2.12.15, so we no longer need to set the scala version at the project root (mentioned in 5.1.1).
+If Linkis is compiled against hadoop3 while spark remains 2.x, the profile `spark-2.4-hadoop-3.3` needs to be activated because of spark compatibility issues.
+```text
+mvn -N install
+mvn clean install -Pspark-3.2 -Phadoop-3.3 -Dmaven.test.skip=true
+mvn clean install -Pspark-2.4-hadoop-3.3 -Phadoop-3.3 -Dmaven.test.skip=true
+```
+
 #### 5.1.5 The pom file of flink-engineconn-flink
 
 ```java
@@ -158,6 +190,7 @@ org.apache.linkis.governance.common.conf.GovernanceCommonConf file adjustment
 
 #### 6.1.1 The pom file of linkis
 
+For Linkis version < 1.3.2
 ```java
 <hadoop.version>3.1.1</hadoop.version>
 <json4s.version>3.2.11</json4s.version>
@@ -170,6 +203,20 @@ org.apache.linkis.governance.common.conf.GovernanceCommonConf file adjustment
 </dependency>
 ```
 
+For Linkis version >= 1.3.2, we only need to set `json4s.version`:
+```java
+<json4s.version>3.2.11</json4s.version>
+```
+This is because we can compile directly with the `hadoop-3.3` or `hadoop-2.7` profile.
+Profile `hadoop-3.3` can be used for any hadoop3.x; the default hadoop3.x version is 3.3.1.
+Profile `hadoop-2.7` can be used for any hadoop2.x; the default hadoop2.x version is 2.7.2.
+To use another version, specify `-Dhadoop.version=xxx` at compile time:
+```text
+mvn -N install
+mvn clean install -Phadoop-3.3 -Dmaven.test.skip=true
+mvn clean install -Phadoop-3.3 -Dhadoop.version=3.1.1 -Dmaven.test.skip=true
+```
+
 #### 6.1.2 The pom file of linkis-engineplugin-hive
 
 ```java
@@ -178,10 +225,23 @@ org.apache.linkis.governance.common.conf.GovernanceCommonConf file adjustment
 
 #### 6.1.3 The pom file of linkis-engineplugin-spark
 
+For Linkis version < 1.3.2
 ```java
 <spark.version>2.3.2</spark.version>
 ```
 
+For Linkis version >= 1.3.2
+
+We can compile directly with the `spark-3.2` profile; if we use hadoop3 at the same time, the `hadoop-3.3` profile also needs to be specified.
+The default spark3.x version is 3.2.1. If we compile with the `spark-3.2` profile, the scala version defaults to 2.12.15, so we no longer need to set the scala version at the project root (mentioned in 5.1.1).
+If Linkis is compiled against hadoop3 while spark remains 2.x, the profile `spark-2.4-hadoop-3.3` needs to be activated because of spark compatibility issues.
+```text
+mvn -N install
+mvn clean install -Pspark-3.2 -Phadoop-3.3 -Dmaven.test.skip=true
+mvn clean install -Pspark-2.4-hadoop-3.3 -Phadoop-3.3 -Dmaven.test.skip=true
+```
+
 #### 6.1.4 linkis-label-common adjustment
 
 org.apache.linkis.manager.label.conf.LabelCommonConfig file adjustment


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
