jackxu2011 commented on code in PR #4110:
URL: https://github.com/apache/linkis/pull/4110#discussion_r1092640648


##########
linkis-engineconn-plugins/spark/pom.xml:
##########
@@ -202,13 +207,103 @@
       <artifactId>linkis-rpc</artifactId>
       <version>${project.version}</version>
     </dependency>
-
+    <dependency>
+      <groupId>org.apache.linkis</groupId>
+      <artifactId>linkis-hadoop-hdfs-client-shade</artifactId>
+      <version>${project.version}</version>

Review Comment:
   
   > I have tried it; it does not seem to work as expected. For now I have excluded hdfs from spark-hive, spark-core, and so on. If the `linkis-hadoop-hdfs-client-shade` dependency is moved into the `spark-2.4-hadoop-3.3` profile, then when we do not compile with hadoop-3.3 and spark-2.4-hadoop-3.3, the spark module will lack the hdfs dependency. If we instead add linkis-hadoop-common to the spark module, the compile fails with `Unrecognized Hadoop major version numb`. And if we do not exclude hdfs from spark-core, spark-hive, and so on, then when we compile with hadoop-3.3 and spark-2.4-hadoop-3.3, the spark module's output lib will contain the hadoop3 dependencies, bringing back our previous version problem.
   
   linkis-hadoop-common will already be on the classpath when the spark EC runs, because linkis-hadoop-common is in lib/linkis-common/public-module.
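   For reference, the exclusion approach described above could look roughly like the following sketch in the spark module's `pom.xml`. This is illustrative only, not the exact PR contents: the Spark `artifactId` suffix and the excluded Hadoop artifact IDs are assumptions.
   
   ```xml
   <!-- Hypothetical sketch: exclude the transitive HDFS client from the Spark
        dependencies (spark-core, spark-hive, etc.) so that the shaded client
        brought in elsewhere is the only HDFS implementation on the classpath.
        groupId/artifactId values below are illustrative assumptions. -->
   <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-core_${scala.binary.version}</artifactId>
     <version>${spark.version}</version>
     <exclusions>
       <exclusion>
         <groupId>org.apache.hadoop</groupId>
         <artifactId>hadoop-hdfs</artifactId>
       </exclusion>
       <exclusion>
         <groupId>org.apache.hadoop</groupId>
         <artifactId>hadoop-hdfs-client</artifactId>
       </exclusion>
     </exclusions>
   </dependency>
   ```
   
   With exclusions like these repeated on each Spark dependency, the hadoop3 jars stay out of the module's output lib regardless of which build profile is active.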



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
