GuoPhilipse commented on PR #4110:
URL: https://github.com/apache/linkis/pull/4110#issuecomment-1383102210
> > I am going to shade the hadoop-client separately; this is intended to solve
the problem of two hadoop-client versions in the Spark or Hive engine classpath,
and to resolve the conflict between a lower Hive version (spark-hive forces the
dependency) and a higher Hadoop version. Do you have better ideas for that?
>
> I think that when using Hadoop 3.3 we should also update the Spark and Hive
versions to ones that directly support Hadoop 3.3.
That does seem to be the easiest way to support different Hadoop versions.
However, I am wondering about users like me, where:
`hdfs cluster version is 3.3.1`
`spark version is 2.4.3`
`yarn client is 2.7.7`
They may be rejected by Linkis until their clients finish upgrading.
@peacewong what do you think?
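
For context, the "shade the hadoop-client separately" approach mentioned above is usually done with the maven-shade-plugin by relocating the Hadoop packages under a new prefix, so the shaded Hadoop 3.3.x classes can coexist on one classpath with the Hadoop 2.7.x classes that Spark 2.4.x pulls in transitively. The sketch below is illustrative only; the relocation prefix and plugin placement are assumptions, not taken from the Linkis build:

```xml
<!-- Hypothetical sketch: relocate hadoop-client classes so a shaded
     Hadoop 3.3.x can sit beside the Hadoop 2.7.x that Spark brings in.
     The shadedPattern prefix is an example, not the one Linkis uses. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- All org.apache.hadoop classes are rewritten to the
                 shaded package inside the produced jar. -->
            <pattern>org.apache.hadoop</pattern>
            <shadedPattern>shaded.org.apache.hadoop</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

A related option, since Hadoop 3.x itself ships pre-shaded client artifacts, is depending on `hadoop-client-api` and `hadoop-client-runtime` instead of the classic `hadoop-client`, which avoids maintaining a custom relocation.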
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]