Github user zentol commented on the issue:
https://github.com/apache/flink/pull/6116
dependency-convergence is now disabled by default and is only checked on
Travis, or manually using the `tools/check-dependency-convergency.sh` script.
---
Github user StephanEwen commented on the issue:
https://github.com/apache/flink/pull/6116
Is there a workaround for users to disable dependency convergence?
It is actually a problem that we don't control the convergence of some
dependencies that are used with varying versions.
Github user zentol commented on the issue:
https://github.com/apache/flink/pull/6116
I'm not aware of a mechanism that allows defining ad-hoc dependency
convergence. Users will have to modify the pom of `flink-shaded-hadoop` and add
a convergence profile for the given version.
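A rough sketch of what such a profile could look like in the `flink-shaded-hadoop` pom (the profile id, activation trigger, and the pinned dependency are illustrative assumptions, not Flink's actual build setup):

```xml
<!-- Hypothetical profile: pin conflicting transitive dependencies
     when building against a custom hadoop version. -->
<profile>
  <id>custom-hadoop-convergence</id>
  <activation>
    <property>
      <name>hadoop.version</name>
      <value>2.7.0-xxx</value> <!-- placeholder version from this discussion -->
    </property>
  </activation>
  <dependencyManagement>
    <dependencies>
      <!-- Example pin: force a single version for a dependency that
           the custom hadoop pulls in at diverging versions. -->
      <dependency>
        <groupId>commons-logging</groupId>
        <artifactId>commons-logging</artifactId>
        <version>1.1.3</version>
      </dependency>
    </dependencies>
  </dependencyManagement>
</profile>
```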
Github user pnowojski commented on the issue:
https://github.com/apache/flink/pull/6116
I think the problem here is not that some of your versions conflict with
flink, but that your dependencies conflict with each other. When I
check:
> mvn -Dhadoop.version=2.7.0
Github user yew1eb commented on the issue:
https://github.com/apache/flink/pull/6116
@pnowojski yes, some of the dependencies in the hadoop version maintained by
our company have conflicts with flink.
Do you have any suggestions for me?
---
Github user zentol commented on the issue:
https://github.com/apache/flink/pull/6116
Ah sorry, I didn't see `-xxx`.
---
Github user pnowojski commented on the issue:
https://github.com/apache/flink/pull/6116
Yes, I know, but I didn't see those problematic dependencies in
> mvn -Dhadoop.version=2.7.0 dependency:tree -pl flink-shaded-hadoop/flink-shaded-hadoop2
which is why I assumed the problem is with
Github user zentol commented on the issue:
https://github.com/apache/flink/pull/6116
Well, it's not really _their_ hadoop; we explicitly allow and intend for users
to set a different hadoop version for `flink-shaded-hadoop`. For the default
version in master we're hiding all dependencies.
Github user pnowojski commented on the issue:
https://github.com/apache/flink/pull/6116
Or can you not converge the dependencies in your hadoop? It seems like it's
causing convergence errors with itself.
Disabling convergence checking would be a step back and asking ourselves for
Github user zentol commented on the issue:
https://github.com/apache/flink/pull/6116
The actual issue here is that these dependencies aren't shaded.
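For reference, shading here means relocating the packages with the maven-shade-plugin so user code never sees the conflicting classes; a minimal sketch (the relocation pattern and shaded prefix are illustrative, not Flink's actual configuration):

```xml
<!-- Sketch: relocate a transitive dependency's packages into a
     shaded namespace so it cannot clash with user dependencies. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- e.g. Guava's packages, rewritten under a shaded prefix -->
            <pattern>com.google.common</pattern>
            <shadedPattern>org.apache.flink.hadoop.shaded.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```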
---
Github user yew1eb commented on the issue:
https://github.com/apache/flink/pull/6116
When I build flink with the hadoop version our company maintains, using a
command like `mvn clean install -DskipTests -Dhadoop.version=2.7.0-xxx`, it
ends in `BUILD FAILURE`.
Some Enforcer rules have
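One possible workaround for a local build is the maven-enforcer-plugin's standard skip property; note this bypasses all enforcer rules, not just the convergence check, so it trades away those safety nets:
> mvn clean install -DskipTests -Dhadoop.version=2.7.0-xxx -Denforcer.skip=true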
Github user zentol commented on the issue:
https://github.com/apache/flink/pull/6116
Why make this change? I don't see a build failure; I guess this is just
another instance of FLINK-9091, in which case this PR is already subsumed by
#6102.
---
Github user bowenli86 commented on the issue:
https://github.com/apache/flink/pull/6116
Will it lower the likelihood of detecting library version conflicts among
Flink's dependencies?
---
Github user yew1eb commented on the issue:
https://github.com/apache/flink/pull/6116
CC @pnowojski
---