twalthr commented on a change in pull request #18272:
URL: https://github.com/apache/flink/pull/18272#discussion_r778936206



##########
File path: flink-core/src/main/java/org/apache/flink/core/classloading/ComponentClassLoader.java
##########
@@ -86,22 +93,40 @@ public ComponentClassLoader(
     protected Class<?> loadClass(final String name, final boolean resolve)
             throws ClassNotFoundException {
         synchronized (getClassLoadingLock(name)) {
-            final Class<?> loadedClass = findLoadedClass(name);
-            if (loadedClass != null) {
-                return resolveIfNeeded(resolve, loadedClass);
-            }
-
-            if (isComponentFirstClass(name)) {
-                return loadClassFromComponentFirst(name, resolve);
+            try {
+                final Class<?> loadedClass = findLoadedClass(name);
+                if (loadedClass != null) {
+                    return resolveIfNeeded(resolve, loadedClass);
+                }
+
+                if (isComponentFirstClass(name)) {
+                    return loadClassFromComponentFirst(name, resolve);
+                }
+                if (isOwnerFirstClass(name)) {
+                    return loadClassFromOwnerFirst(name, resolve);
+                }
+
+                // making this behavior configurable (component-only/component-first/owner-first)
+                // would
+                // allow this class to subsume the FlinkUserCodeClassLoader (with an added exception
+                // handler)
+                return loadClassFromComponentOnly(name, resolve);
+            } catch (ClassNotFoundException e) {
+                // If we know the package of this class
+                Optional<String> foundAssociatedModule =
+                        knownPackagePrefixesModuleAssociation.entrySet().stream()
+                                .filter(entry -> name.startsWith(entry.getKey()))
+                                .map(Map.Entry::getValue)
+                                .findFirst();
+                if (foundAssociatedModule.isPresent()) {
+                    throw new ClassNotFoundException(
+                            String.format(
+                                    "Class '%s' not found. Perhaps you forgot to add the module '%s' in the classpath?",

Review comment:
       `in the classpath` -> `to the classpath`?
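
For context, the change under review boils down to a package-prefix lookup that enriches the `ClassNotFoundException` message. A minimal standalone sketch of the same pattern (the field name `knownPackagePrefixesModuleAssociation` is taken from the diff above; the class name, method, and map contents here are hypothetical):

```java
import java.util.Map;
import java.util.Optional;

class MissingModuleHint {
    // Hypothetical mapping from package prefixes to the module that ships them.
    private static final Map<String, String> knownPackagePrefixesModuleAssociation =
            Map.of("org.apache.flink.table.runtime", "flink-table-runtime");

    // Returns a ClassNotFoundException enriched with a hint about the missing module, if known.
    static ClassNotFoundException enrich(String name, ClassNotFoundException cause) {
        Optional<String> module =
                knownPackagePrefixesModuleAssociation.entrySet().stream()
                        .filter(entry -> name.startsWith(entry.getKey()))
                        .map(Map.Entry::getValue)
                        .findFirst();
        if (module.isPresent()) {
            return new ClassNotFoundException(
                    String.format(
                            "Class '%s' not found. Perhaps you forgot to add the module '%s' to the classpath?",
                            name, module.get()),
                    cause);
        }
        return cause;
    }
}
```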

##########
File path: flink-core/src/main/java/org/apache/flink/core/classloading/ComponentClassLoader.java
##########
@@ -86,22 +93,40 @@ public ComponentClassLoader(
     protected Class<?> loadClass(final String name, final boolean resolve)
             throws ClassNotFoundException {
         synchronized (getClassLoadingLock(name)) {
-            final Class<?> loadedClass = findLoadedClass(name);
-            if (loadedClass != null) {
-                return resolveIfNeeded(resolve, loadedClass);
-            }
-
-            if (isComponentFirstClass(name)) {
-                return loadClassFromComponentFirst(name, resolve);
+            try {
+                final Class<?> loadedClass = findLoadedClass(name);
+                if (loadedClass != null) {
+                    return resolveIfNeeded(resolve, loadedClass);
+                }
+
+                if (isComponentFirstClass(name)) {
+                    return loadClassFromComponentFirst(name, resolve);
+                }
+                if (isOwnerFirstClass(name)) {
+                    return loadClassFromOwnerFirst(name, resolve);
+                }
+
+                // making this behavior configurable (component-only/component-first/owner-first)
+                // would

Review comment:
       fix comment formatting
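
For reference, one possible reflow of the comment that keeps the original wording and only fixes the line breaks:

```java
// Making this behavior configurable (component-only/component-first/owner-first)
// would allow this class to subsume the FlinkUserCodeClassLoader
// (with an added exception handler).
return loadClassFromComponentOnly(name, resolve);
```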

##########
File path: flink-examples/flink-examples-table/pom.xml
##########
@@ -53,34 +53,42 @@ under the License.
                        <version>${project.version}</version>
                </dependency>
 
-               <!-- Table connectors and formats -->
+               <!-- The following two dependencies are not required to define a SQL job pipeline,
+               but only to execute it.
+
+               In particular, here we're forced to use flink-table-planner_${scala.binary.version} instead of
+               flink-table-planner-loader, because otherwise you hit this bug https://youtrack.jetbrains.com/issue/IDEA-93855
+               when trying to run the examples from within Intellij IDEA.
+
+               In a real environment, you need these dependencies only at test scope, to execute the SQL job pipelines test.
+                -->
                <dependency>
                        <groupId>org.apache.flink</groupId>
-                       <artifactId>flink-connector-files</artifactId>
+                       <artifactId>flink-table-runtime</artifactId>
                        <version>${project.version}</version>
                </dependency>
                <dependency>
                        <groupId>org.apache.flink</groupId>
-                       <artifactId>flink-csv</artifactId>
+                       <artifactId>flink-table-planner_${scala.binary.version}</artifactId>

Review comment:
       Provided? Or is it somehow excluded from the JAR files we generate for the dist?

##########
File path: flink-examples/flink-examples-table/pom.xml
##########
@@ -53,34 +53,42 @@ under the License.
                        <version>${project.version}</version>
                </dependency>
 

Review comment:
       remove empty line

##########
File path: flink-examples/flink-examples-table/pom.xml
##########
@@ -53,34 +53,42 @@ under the License.
                        <version>${project.version}</version>
                </dependency>
 
-               <!-- Table connectors and formats -->
+               <!-- The following two dependencies are not required to define a SQL job pipeline,
+               but only to execute it.
+
+               In particular, here we're forced to use flink-table-planner_${scala.binary.version} instead of
+               flink-table-planner-loader, because otherwise you hit this bug https://youtrack.jetbrains.com/issue/IDEA-93855
+               when trying to run the examples from within Intellij IDEA.
+
+               In a real environment, you need these dependencies only at test scope, to execute the SQL job pipelines test.

Review comment:
       `at test or provided scope (for tests or running it in an IDE).`

##########
File path: flink-table/flink-table-planner/pom.xml
##########
@@ -196,14 +189,15 @@ under the License.
                        <!-- For legacy string expressions in Table API -->
                        <groupId>org.scala-lang.modules</groupId>
                        <artifactId>scala-parser-combinators_${scala.binary.version}</artifactId>
+                       <scope>provided</scope>

Review comment:
       Not sure if this is provided. It is not contained in Scala 2.12 by default.

##########
File path: flink-table/flink-table-planner/pom.xml
##########
@@ -196,14 +189,15 @@ under the License.
                        <!-- For legacy string expressions in Table API -->
                        <groupId>org.scala-lang.modules</groupId>
                        <artifactId>scala-parser-combinators_${scala.binary.version}</artifactId>
+                       <scope>provided</scope>
                </dependency>
 
                <dependency>
                        <!-- Utility to scan classpaths -->
                        <groupId>org.reflections</groupId>
                        <artifactId>reflections</artifactId>
                        <version>0.9.10</version>
-                       <scope>compile</scope>
+                       <scope>provided</scope>

Review comment:
       Who pulls this in? Not table-runtime.



