[GitHub] [flink] zentol commented on a diff in pull request #21349: [FLINK-28203] Support Maven 3.3+

2023-05-11 Thread via GitHub


zentol commented on code in PR #21349:
URL: https://github.com/apache/flink/pull/21349#discussion_r1191632889


##
tools/ci/flink-ci-tools/src/main/java/org/apache/flink/tools/ci/optional/ShadeOptionalChecker.java:
##
@@ -0,0 +1,266 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.tools.ci.optional;
+
+import org.apache.flink.annotation.VisibleForTesting;
+import org.apache.flink.tools.ci.utils.dependency.DependencyParser;
+import org.apache.flink.tools.ci.utils.shade.ShadeParser;
+import org.apache.flink.tools.ci.utils.shared.Dependency;
+import org.apache.flink.tools.ci.utils.shared.DependencyTree;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.Collection;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+/**
+ * Verifies that all dependencies bundled with the shade-plugin are marked as 
optional in the pom.
+ * This ensures compatibility with later maven versions and in general 
simplifies dependency
+ * management as transitivity is no longer dependent on the shade-plugin.
+ *
+ * In Maven 3.3 the dependency tree was made immutable at runtime, and thus 
can no longer be
+ * changed by the shade plugin. The plugin would usually remove a dependency from the tree when it
+ * is being bundled (known as dependency reduction). Dependency reduction still works for the
+ * published poms (== what users consume), since the plugin can still change the contents of the
+ * final pom, but while developing Flink it no longer works. This breaks plenty of things, since
+ * suddenly a bunch of dependencies are still visible to downstream modules that weren't before.
+ *
+ * To work around this we mark all dependencies that we bundle as optional; this makes them
+ * non-transitive. To a downstream module, behavior-wise a non-transitive 
dependency is identical to
+ * a removed dependency.
+ *
+ * This checker analyzes the bundled dependencies (based on the shade-plugin output) and the set
+ * of dependencies (based on the dependency plugin) to detect cases where a dependency is not
+ * marked as optional when it should be.
+ *
+ * The enforced rule is rather simple: Any dependency that is bundled, or 
any of its parents,
+ * must show up as optional in the dependency tree. The parent clause is 
required to cover cases
+ * where a module has 2 paths to a bundled dependency. If a module depends on 
A1/A2, each depending
+ * on B, with A1 and B being bundled, then even if A1 is marked as optional B 
is still shown as a
+ * non-optional dependency (because the non-optional A2 still needs it!).
+ */
+public class ShadeOptionalChecker {
+private static final Logger LOG = 
LoggerFactory.getLogger(ShadeOptionalChecker.class);
+
+public static void main(String[] args) throws IOException {
+if (args.length < 2) {
+System.out.println(
+"Usage: ShadeOptionalChecker  
");
+System.exit(1);
+}
+
+final Path shadeOutputPath = Paths.get(args[0]);
+final Path dependencyOutputPath = Paths.get(args[1]);
+
+final Map<String, Set<Dependency>> bundledDependenciesByModule =
+ShadeParser.parseShadeOutput(shadeOutputPath);
+final Map<String, DependencyTree> dependenciesByModule =
+DependencyParser.parseDependencyTreeOutput(dependencyOutputPath);
+
+final Map<String, Collection<Dependency>> violations =
+checkOptionalFlags(bundledDependenciesByModule, dependenciesByModule);
+
+if (!violations.isEmpty()) {
+LOG.error(
+"{} modules bundle in total {} dependencies without them 
being marked as optional in the pom.",
+violations.keySet().size(),
+violations.size());

Review Comment:
   This is annoying. I actually noticed that on CI but forgot to fix it :cry: 
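If the annoyance here is that `violations.size()` (the number of offending modules) is also used where the total number of offending dependencies is wanted, the tally could instead sum the per-module collections. A hypothetical sketch with plain collections (names invented, not the actual Flink code):

```java
import java.util.Collection;
import java.util.List;
import java.util.Map;

public class ViolationCount {
    // Total offending dependencies across all modules; contrast with
    // violations.size(), which only counts the modules themselves.
    static int totalViolations(Map<String, ? extends Collection<String>> violations) {
        return violations.values().stream().mapToInt(Collection::size).sum();
    }

    public static void main(String[] args) {
        Map<String, List<String>> violations =
                Map.of(
                        "flink-sql-parquet", List.of("parquet-avro", "avro"),
                        "flink-python", List.of("py4j"));
        // 2 modules, 3 dependencies in total
        System.out.println(
                violations.keySet().size() + " modules, "
                        + totalViolations(violations) + " dependencies");
    }
}
```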



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@flink.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

[GitHub] [flink] zentol commented on a diff in pull request #21349: [FLINK-28203] Support Maven 3.3+

2023-05-11 Thread via GitHub


zentol commented on code in PR #21349:
URL: https://github.com/apache/flink/pull/21349#discussion_r1191232405


##
tools/ci/flink-ci-tools/src/test/java/org/apache/flink/tools/ci/optional/ShadeOptionalCheckerTest.java:
##
@@ -0,0 +1,157 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.tools.ci.optional;
+
+import org.apache.flink.tools.ci.utils.shared.Dependency;
+import org.apache.flink.tools.ci.utils.shared.DependencyTree;
+
+import org.junit.jupiter.api.Test;
+
+import java.util.Collections;
+import java.util.Set;
+
+import static org.assertj.core.api.Assertions.assertThat;
+
+class ShadeOptionalCheckerTest {
+private static final String MODULE = "module";
+
+@Test
+void testNonBundledDependencyIsIgnored() {
+final Dependency dependency = createMandatoryDependency("a");
+final Set<Dependency> bundled = Collections.emptySet();
+final DependencyTree dependencyTree = new DependencyTree().addDirectDependency(dependency);
+
+final Set<Dependency> violations =
+ShadeOptionalChecker.checkOptionalFlags(MODULE, bundled, dependencyTree);
+
+assertThat(violations).isEmpty();
+}
+
+@Test
+void testNonBundledDependencyIsIgnoredEvenIfOthersAreBundled() {
+final Dependency dependencyA = createMandatoryDependency("a");
+final Dependency dependencyB = createMandatoryDependency("B");
+final Set<Dependency> bundled = Collections.singleton(dependencyB);
+final DependencyTree dependencyTree =
+new DependencyTree()
+.addDirectDependency(dependencyA)
+.addDirectDependency(dependencyB);
+
+final Set<Dependency> violations =
+ShadeOptionalChecker.checkOptionalFlags(MODULE, bundled, dependencyTree);
+
+assertThat(violations).containsExactly(dependencyB);
+}
+
+@Test
+void testDirectBundledOptionalDependencyIsAccepted() {
+final Dependency dependency = createOptionalDependency("a");
+final Set<Dependency> bundled = Collections.singleton(dependency);
+final DependencyTree dependencyTree = new DependencyTree().addDirectDependency(dependency);
+
+final Set<Dependency> violations =
+ShadeOptionalChecker.checkOptionalFlags(MODULE, bundled, dependencyTree);
+
+assertThat(violations).isEmpty();
+}
+
+@Test
+void testDirectBundledDependencyMustBeOptional() {
+final Dependency dependency = createMandatoryDependency("a");
+final Set<Dependency> bundled = Collections.singleton(dependency);
+final DependencyTree dependencyTree = new DependencyTree().addDirectDependency(dependency);
+
+final Set<Dependency> violations =
+ShadeOptionalChecker.checkOptionalFlags(MODULE, bundled, dependencyTree);
+
+assertThat(violations).containsExactly(dependency);
+}
+
+@Test
+void testTransitiveBundledOptionalDependencyIsAccepted() {
+final Dependency dependencyA = createMandatoryDependency("a");
+final Dependency dependencyB = createOptionalDependency("b");
+final Set<Dependency> bundled = Collections.singleton(dependencyB);
+final DependencyTree dependencyTree =
+new DependencyTree()
+.addDirectDependency(dependencyA)
+.addTransitiveDependencyTo(dependencyB, dependencyA);
+
+final Set<Dependency> violations =
+ShadeOptionalChecker.checkOptionalFlags(MODULE, bundled, dependencyTree);
+
+assertThat(violations).isEmpty();
+}
+
+@Test
+void testTransitiveBundledDependencyMustBeOptional() {
+final Dependency dependencyA = createMandatoryDependency("a");
+final Dependency dependencyB = createMandatoryDependency("b");
+final Set<Dependency> bundled = Collections.singleton(dependencyB);
+final DependencyTree dependencyTree =
+new DependencyTree()
+.addDirectDependency(dependencyA)
+.addTransitiveDependencyTo(dependencyB, dependencyA);
+
+final Set<Dependency> violations =
+ShadeOptionalChecker.checkOptionalFlags(MODULE, bundled, dependencyTree);
+
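The two-paths case from the checker's javadoc (a module depends on A1 and A2, both depending on B, with A1 and B bundled) can be modeled with plain collections. This is a hypothetical mini-model of the described rule — a bundled dependency passes only if it, or an ancestor on its tree path, is optional — not the actual ShadeOptionalChecker API:

```java
import java.util.Map;
import java.util.Set;

public class OptionalRuleSketch {
    // Walks from the dependency up its single path in the tree (as printed
    // by mvn dependency:tree) and checks whether any node on it is optional.
    static boolean isCovered(String dep, Map<String, String> parentOf, Set<String> optional) {
        for (String d = dep; d != null; d = parentOf.get(d)) {
            if (optional.contains(d)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // Maven attaches B to the mandatory path via A2, so marking only A1
        // as optional does not cover the bundled B: a violation is reported.
        System.out.println(isCovered("B", Map.of("B", "A2"), Set.of("A1"))); // false
    }
}
```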
+

[GitHub] [flink] zentol commented on a diff in pull request #21349: [FLINK-28203] Support Maven 3.3+

2023-05-09 Thread via GitHub


zentol commented on code in PR #21349:
URL: https://github.com/apache/flink/pull/21349#discussion_r1188557290


##
flink-python/pom.xml:
##


Review Comment:
   We don't _need_ to generally speaking. IIRC the s3 filesystem is one of 
those hyper-problematic modules with loads of dependencies being associated 
with test dependencies.






[GitHub] [flink] zentol commented on a diff in pull request #21349: [FLINK-28203] Support Maven 3.3+

2023-05-09 Thread via GitHub


zentol commented on code in PR #21349:
URL: https://github.com/apache/flink/pull/21349#discussion_r1188449141


##
flink-formats/flink-sql-parquet/pom.xml:
##
@@ -42,12 +42,14 @@ under the License.
<groupId>org.apache.flink</groupId>
<artifactId>flink-parquet</artifactId>
<version>${project.version}</version>
+<optional>${flink.markBundledAsOptional}</optional>
</dependency>

<dependency>
<groupId>org.apache.parquet</groupId>
<artifactId>parquet-avro</artifactId>
<version>${flink.format.parquet.version}</version>
+<optional>${flink.markBundledAsOptional}</optional>
</dependency>
Review Comment:
   according to the shade-plugin configuration it is.
   ```
   <include>org.apache.parquet:parquet-avro</include>
   ```






[GitHub] [flink] zentol commented on a diff in pull request #21349: [FLINK-28203] Support Maven 3.3+

2023-05-09 Thread via GitHub


zentol commented on code in PR #21349:
URL: https://github.com/apache/flink/pull/21349#discussion_r1188442180


##
README.md:
##


Review Comment:
   The ide setup guide says that eclipse isn't supported:
   
`https://nightlies.apache.org/flink/flink-docs-release-1.17/docs/flinkdev/ide_setup/#eclipse`
   
   I think it's fine to assume that people use IntelliJ.
   






[GitHub] [flink] zentol commented on a diff in pull request #21349: [FLINK-28203] Support Maven 3.3+

2023-05-08 Thread via GitHub


zentol commented on code in PR #21349:
URL: https://github.com/apache/flink/pull/21349#discussion_r1187563240


##
tools/ci/flink-ci-tools/src/main/java/org/apache/flink/tools/ci/optional/ShadeOptionalChecker.java:
##
@@ -0,0 +1,206 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.tools.ci.optional;
+
+import org.apache.flink.annotation.VisibleForTesting;
+import org.apache.flink.tools.ci.utils.dependency.DependencyParser;
+import org.apache.flink.tools.ci.utils.shade.ShadeParser;
+import org.apache.flink.tools.ci.utils.shared.Dependency;
+import org.apache.flink.tools.ci.utils.shared.DependencyTree;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.Collection;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+/**
+ * Verifies that all dependencies bundled with the shade-plugin are marked as 
optional in the pom.
+ * This ensures compatibility with later maven versions and in general 
simplifies dependency
+ * management as transitivity is no longer dependent on the shade-plugin.
+ */
+public class ShadeOptionalChecker {
+private static final Logger LOG = 
LoggerFactory.getLogger(ShadeOptionalChecker.class);
+
+public static void main(String[] args) throws IOException {
+if (args.length < 2) {
+System.out.println(
+"Usage: ShadeOptionalChecker  
");
+System.exit(1);
+}
+
+final Path shadeOutputPath = Paths.get(args[0]);
+final Path dependencyOutputPath = Paths.get(args[1]);
+
+final Map<String, Set<Dependency>> bundledDependenciesByModule =
+ShadeParser.parseShadeOutput(shadeOutputPath);
+final Map<String, DependencyTree> dependenciesByModule =
+DependencyParser.parseDependencyTreeOutput(dependencyOutputPath);
+
+final Map<String, Collection<Dependency>> violations =
+checkOptionalFlags(bundledDependenciesByModule, dependenciesByModule);
+
+if (!violations.isEmpty()) {
+LOG.error(
+"{} modules bundle in total {} dependencies without them 
being marked as optional in the pom.",
+violations.keySet().size(),
+violations.size());
+LOG.error(
+"\tIn order for shading to properly work within Flink we 
require all bundled dependencies to be marked as optional in the pom.");
+LOG.error(
+"\tFor verification purposes we require the dependency 
tree from the dependency-plugin to show the dependency as either:");
+LOG.error("\t\ta) an optional dependency,");
+LOG.error("\t\tb) a transitive dependency of another optional 
dependency.");
+LOG.error(
+"\tIn most cases adding '<optional>${flink.markBundledAsOptional}</optional>' to the bundled dependency is sufficient.");
+LOG.error(
+"\tThere are some edge cases where a transitive dependency 
might be associated with the \"wrong\" dependency in the tree, for example if a 
test dependency also requires it.");
+LOG.error(
+"\tIn such cases you need to adjust the poms so that the 
dependency shows up in the right spot. This may require adding an explicit 
dependency (Management) entry, excluding dependencies, or at times even 
reordering dependencies in the pom.");
+LOG.error(
+"\tSee the Dependencies page in the wiki for details: https://cwiki.apache.org/confluence/display/FLINK/Dependencies");
+
+for (String moduleWithViolations : violations.keySet()) {
+final Collection<Dependency> dependencyViolations =
+violations.get(moduleWithViolations);
+LOG.error(
+"\tModule {} ({} violation{}):",
+moduleWithViolations,
+dependencyViolations.size(),
+dependencyViolations.size() == 1 ? "" : "s");
+for (Dependency dependencyViolation : 

[GitHub] [flink] zentol commented on a diff in pull request #21349: [FLINK-28203] Support Maven 3.3+

2023-05-08 Thread via GitHub


zentol commented on code in PR #21349:
URL: https://github.com/apache/flink/pull/21349#discussion_r1187556938


##
tools/ci/flink-ci-tools/src/main/java/org/apache/flink/tools/ci/optional/ShadeOptionalChecker.java:
##
@@ -0,0 +1,206 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.tools.ci.optional;
+
+import org.apache.flink.annotation.VisibleForTesting;
+import org.apache.flink.tools.ci.utils.dependency.DependencyParser;
+import org.apache.flink.tools.ci.utils.shade.ShadeParser;
+import org.apache.flink.tools.ci.utils.shared.Dependency;
+import org.apache.flink.tools.ci.utils.shared.DependencyTree;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.Collection;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+/**
+ * Verifies that all dependencies bundled with the shade-plugin are marked as 
optional in the pom.
+ * This ensures compatibility with later maven versions and in general 
simplifies dependency
+ * management as transitivity is no longer dependent on the shade-plugin.
+ */
+public class ShadeOptionalChecker {
+private static final Logger LOG = 
LoggerFactory.getLogger(ShadeOptionalChecker.class);
+
+public static void main(String[] args) throws IOException {
+if (args.length < 2) {
+System.out.println(
+"Usage: ShadeOptionalChecker  
");
+System.exit(1);
+}
+
+final Path shadeOutputPath = Paths.get(args[0]);
+final Path dependencyOutputPath = Paths.get(args[1]);
+
+final Map<String, Set<Dependency>> bundledDependenciesByModule =
+ShadeParser.parseShadeOutput(shadeOutputPath);
+final Map<String, DependencyTree> dependenciesByModule =
+DependencyParser.parseDependencyTreeOutput(dependencyOutputPath);
+
+final Map<String, Collection<Dependency>> violations =
+checkOptionalFlags(bundledDependenciesByModule, dependenciesByModule);
+
+if (!violations.isEmpty()) {
+LOG.error(
+"{} modules bundle in total {} dependencies without them 
being marked as optional in the pom.",
+violations.keySet().size(),
+violations.size());
+LOG.error(
+"\tIn order for shading to properly work within Flink we 
require all bundled dependencies to be marked as optional in the pom.");
+LOG.error(
+"\tFor verification purposes we require the dependency 
tree from the dependency-plugin to show the dependency as either:");
+LOG.error("\t\ta) an optional dependency,");
+LOG.error("\t\tb) a transitive dependency of another optional 
dependency.");
+LOG.error(
+"\tIn most cases adding '<optional>${flink.markBundledAsOptional}</optional>' to the bundled dependency is sufficient.");
+LOG.error(
+"\tThere are some edge cases where a transitive dependency 
might be associated with the \"wrong\" dependency in the tree, for example if a 
test dependency also requires it.");
+LOG.error(
+"\tIn such cases you need to adjust the poms so that the 
dependency shows up in the right spot. This may require adding an explicit 
dependency (Management) entry, excluding dependencies, or at times even 
reordering dependencies in the pom.");
+LOG.error(
+"\tSee the Dependencies page in the wiki for details: https://cwiki.apache.org/confluence/display/FLINK/Dependencies");
+
+for (String moduleWithViolations : violations.keySet()) {
+final Collection<Dependency> dependencyViolations =
+violations.get(moduleWithViolations);
+LOG.error(
+"\tModule {} ({} violation{}):",
+moduleWithViolations,
+dependencyViolations.size(),
+dependencyViolations.size() == 1 ? "" : "s");
+for (Dependency dependencyViolation : 

[GitHub] [flink] zentol commented on a diff in pull request #21349: [FLINK-28203] Support Maven 3.3+

2023-05-05 Thread via GitHub


zentol commented on code in PR #21349:
URL: https://github.com/apache/flink/pull/21349#discussion_r1186110665


##
tools/ci/flink-ci-tools/src/main/java/org/apache/flink/tools/ci/optional/ShadeOptionalChecker.java:
##
@@ -0,0 +1,206 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.tools.ci.optional;
+
+import org.apache.flink.annotation.VisibleForTesting;
+import org.apache.flink.tools.ci.utils.dependency.DependencyParser;
+import org.apache.flink.tools.ci.utils.shade.ShadeParser;
+import org.apache.flink.tools.ci.utils.shared.Dependency;
+import org.apache.flink.tools.ci.utils.shared.DependencyTree;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.Collection;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+/**
+ * Verifies that all dependencies bundled with the shade-plugin are marked as 
optional in the pom.
+ * This ensures compatibility with later maven versions and in general 
simplifies dependency
+ * management as transitivity is no longer dependent on the shade-plugin.
+ */
+public class ShadeOptionalChecker {
+private static final Logger LOG = 
LoggerFactory.getLogger(ShadeOptionalChecker.class);
+
+public static void main(String[] args) throws IOException {
+if (args.length < 2) {
+System.out.println(
+"Usage: ShadeOptionalChecker  
");
+System.exit(1);
+}
+
+final Path shadeOutputPath = Paths.get(args[0]);
+final Path dependencyOutputPath = Paths.get(args[1]);
+
+final Map<String, Set<Dependency>> bundledDependenciesByModule =
+ShadeParser.parseShadeOutput(shadeOutputPath);
+final Map<String, DependencyTree> dependenciesByModule =
+DependencyParser.parseDependencyTreeOutput(dependencyOutputPath);
+
+final Map<String, Collection<Dependency>> violations =
+checkOptionalFlags(bundledDependenciesByModule, dependenciesByModule);
+
+if (!violations.isEmpty()) {
+LOG.error(
+"{} modules bundle in total {} dependencies without them 
being marked as optional in the pom.",
+violations.keySet().size(),
+violations.size());
+LOG.error(
+"\tIn order for shading to properly work within Flink we 
require all bundled dependencies to be marked as optional in the pom.");
+LOG.error(
+"\tFor verification purposes we require the dependency 
tree from the dependency-plugin to show the dependency as either:");
+LOG.error("\t\ta) an optional dependency,");
+LOG.error("\t\tb) a transitive dependency of another optional 
dependency.");
+LOG.error(
+"\tIn most cases adding '<optional>${flink.markBundledAsOptional}</optional>' to the bundled dependency is sufficient.");
+LOG.error(
+"\tThere are some edge cases where a transitive dependency 
might be associated with the \"wrong\" dependency in the tree, for example if a 
test dependency also requires it.");
+LOG.error(
+"\tIn such cases you need to adjust the poms so that the 
dependency shows up in the right spot. This may require adding an explicit 
dependency (Management) entry, excluding dependencies, or at times even 
reordering dependencies in the pom.");
+LOG.error(
+"\tSee the Dependencies page in the wiki for details: https://cwiki.apache.org/confluence/display/FLINK/Dependencies");
+
+for (String moduleWithViolations : violations.keySet()) {
+final Collection<Dependency> dependencyViolations =
+violations.get(moduleWithViolations);
+LOG.error(
+"\tModule {} ({} violation{}):",
+moduleWithViolations,
+dependencyViolations.size(),
+dependencyViolations.size() == 1 ? "" : "s");
+for (Dependency dependencyViolation : 

[GitHub] [flink] zentol commented on a diff in pull request #21349: [FLINK-28203] Support Maven 3.3+

2023-05-05 Thread via GitHub


zentol commented on code in PR #21349:
URL: https://github.com/apache/flink/pull/21349#discussion_r1186107303


##
flink-connectors/flink-sql-connector-hive-3.1.3/pom.xml:
##
@@ -82,13 +84,15 @@ under the License.
<groupId>org.antlr</groupId>
<artifactId>antlr-runtime</artifactId>
<version>3.5.2</version>
+<optional>${flink.markBundledAsOptional}</optional>
</dependency>

<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-core</artifactId>
<version>3.1.0</version>
+<optional>${flink.markBundledAsOptional}</optional>
</dependency>
Review Comment:
   No you're correct. I will try to change the packaging to reduce the likelihood of such errors.






[GitHub] [flink] zentol commented on a diff in pull request #21349: [FLINK-28203] Support Maven 3.3+

2023-05-05 Thread via GitHub


zentol commented on code in PR #21349:
URL: https://github.com/apache/flink/pull/21349#discussion_r1186106077


##
tools/ci/verify_bundled_optional.sh:
##
@@ -0,0 +1,48 @@
+#!/usr/bin/env bash
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+## Checks that all bundled dependencies are marked as optional in the poms
+MVN_CLEAN_COMPILE_OUT=$1
+CI_DIR=$2
+FLINK_ROOT=$3
+
+source "${CI_DIR}/maven-utils.sh"
+
+cd "$FLINK_ROOT" || exit
+
+dependency_plugin_output=${CI_DIR}/optional_dep.txt
+
+run_mvn dependency:tree -B >> "${dependency_plugin_output}"

Review Comment:
   I just can't remember whether `>` or `>>` is append/create :sweat: 
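For the record: `>` truncates (or creates) the target before writing, while `>>` appends, so the `>>` above only matters if the file could already exist. A quick check:

```shell
#!/usr/bin/env bash
set -eu

out=$(mktemp)

echo "first"  > "$out"   # '>' creates/truncates the file
echo "second" >> "$out"  # '>>' appends to it
echo "third"  > "$out"   # '>' again: earlier contents are discarded

cat "$out"  # prints only: third
rm -f "$out"
```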


