rmetzger commented on a change in pull request #11983:
URL: https://github.com/apache/flink/pull/11983#discussion_r423002221
##########
File path: flink-end-to-end-tests/pom.xml
##########
@@ -255,6 +298,21 @@ under the License.
</execution>
</executions>
</plugin>
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-enforcer-plugin</artifactId>
Review comment:
But I don't know if it makes sense to mess with Hadoop's transitive
dependency tree.
In our tests, we want to ensure that Flink works with certain vanilla Hadoop
versions.
If we start hand-crafting Hadoop's dependencies towards convergence, we are
no longer ensuring that Flink works with those versions -- only that it works
with our modified version of them.
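To illustrate the kind of hand-crafting I mean (a sketch only; the artifact and version below are hypothetical, not taken from this PR): pinning one of Hadoop's transitive dependencies via dependencyManagement to silence a convergence violation would mean we test our tree rather than vanilla Hadoop's.

```xml
<!-- Illustration only: forcing a single version of a transitive Hadoop
     dependency to satisfy the convergence check. Artifact and version are
     hypothetical examples. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>commons-collections</groupId>
      <artifactId>commons-collections</artifactId>
      <version>3.2.2</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```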
If the maven-enforcer-plugin allowed finer-grained control over the
convergence check, I would not be opposed to it, since we need to ensure that
none of our dependencyManagement entries, exclusions, etc. affect vanilla
Hadoop's dependency tree.
A second problem is that the exclusions might differ between the Hadoop
versions we use on CI: for Hadoop 2.4.1 we have convergence; for 2.8.3 we
don't.
Given these considerations, I believe we should disable the convergence check
for the tests and rely on test failures to detect severe issues with our
Hadoop integration. We just need to accept that the Hadoop project is the wild
west when it comes to dependencies.
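For reference, one standard Maven way to disable an inherited enforcer execution in just this module would be to override it by id and bind it to phase `none` (the execution id `dependency-convergence` below is an assumption; it must match whatever id the parent pom uses):

```xml
<!-- Sketch: in flink-end-to-end-tests/pom.xml, override the inherited
     enforcer execution. The id "dependency-convergence" is an assumption
     and must match the parent pom's execution id. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>dependency-convergence</id>
      <!-- Binding to phase "none" unbinds the execution for this module. -->
      <phase>none</phase>
    </execution>
  </executions>
</plugin>
```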
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]