twalthr commented on a change in pull request #18134:
URL: https://github.com/apache/flink/pull/18134#discussion_r773678619



##########
File path: flink-table/README.md
##########
@@ -0,0 +1,67 @@
+# Table API & SQL
+
+Apache Flink features two relational APIs - the Table API and SQL - for unified stream and batch processing.
+The Table API is a language-integrated query API for Java, Scala, and Python that allows the composition of queries from relational operators such as selection, filter, and join in a very intuitive way.
+
+This documentation is intended for contributors to the table modules, not for users.
+If you want to use Table API & SQL, check out the [documentation](https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/overview/).
+
+## Modules
+
+### Common
+
+* `flink-table-common`:
+  * Type system definition
+  * UDF stack and built-in function definitions
+  * Internal data definitions
+  * Extension points for catalogs, formats, connectors
+  * Core APIs for extension points such as `Schema`
+  * Utilities to deal with type system, internal data types and printing
+  * When implementing a format, you usually need to depend only on this module

Review comment:
       Drop this line. Line 17 is enough explanation.

##########
File path: flink-table/flink-table-api-java-uber/pom.xml
##########
@@ -73,26 +57,6 @@ under the License.
                        <artifactId>flink-table-api-java-bridge</artifactId>
                        <version>${project.version}</version>
                </dependency>
-               <dependency>

Review comment:
       In theory a dep on `flink-table-api-java-bridge` would be enough? No `flink-table-api-bridge-base`, `flink-table-api-java`, or `flink-table-common`?

##########
File path: flink-table/flink-table-runtime/pom.xml
##########
@@ -64,35 +62,38 @@ under the License.
                        <version>${project.version}</version>
                </dependency>
 
+               <!-- Flink dependencies -->
                <dependency>
                        <groupId>org.apache.flink</groupId>
                        <artifactId>flink-streaming-java</artifactId>
                        <version>${project.version}</version>
-                       <scope>provided</scope>
                </dependency>
 
                <dependency>
                        <groupId>org.apache.flink</groupId>
                        <artifactId>flink-cep</artifactId>
                        <version>${project.version}</version>
-                       <scope>provided</scope>
                </dependency>
 
+               <!-- Java compiler -->
                <dependency>
                        <groupId>org.codehaus.janino</groupId>
                        <artifactId>janino</artifactId>
-                       <version>${janino.version}</version>
+               </dependency>
+               <dependency>
+                       <groupId>org.codehaus.janino</groupId>
+                       <artifactId>commons-compiler</artifactId>
                </dependency>
 
-               <!-- Jackson -->
+               <!-- Jackson and JsonPath (used by calcite) -->

Review comment:
       now used by us? maybe remove `(used by calcite)`

##########
File path: flink-table/README.md
##########
@@ -0,0 +1,67 @@
+# Table API & SQL
+
+Apache Flink features two relational APIs - the Table API and SQL - for unified stream and batch processing.
+The Table API is a language-integrated query API for Java, Scala, and Python that allows the composition of queries from relational operators such as selection, filter, and join in a very intuitive way.
+
+This documentation is intended for contributors to the table modules, not for users.
+If you want to use Table API & SQL, check out the [documentation](https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/overview/).
+
+## Modules
+
+### Common
+
+* `flink-table-common`:
+  * Type system definition
+  * UDF stack and built-in function definitions
+  * Internal data definitions
+  * Extension points for catalogs, formats, connectors
+  * Core APIs for extension points such as `Schema`
+  * Utilities to deal with type system, internal data types and printing
+  * When implementing a format, you usually need to depend only on this module
+
+### API
+
+* `flink-table-api-java`: 
+  * Java APIs for Table API and SQL
+  * Package `org.apache.flink.table.delegation`, which serves as entrypoint for all planner capabilities
+* `flink-table-api-scala`: Scala APIs for Table API and SQL
+* `flink-table-api-bridge-base`: Base classes for APIs to bridge between Table API and DataStream API
+* `flink-table-api-java-bridge`:
+  * Java APIs to bridge between Table API and DataStream API
+  * When implementing a connector, you usually need to depend only on this module, in order to bridge your connector implementation developed with DataStream to Table API
+* `flink-table-api-scala-bridge`: Scala APIs to bridge between Table API and DataStream API
+* `flink-table-api-java-uber`:
+  * Uber JAR bundling `flink-table-common` and all the Java API modules, including 3rd party dependencies.
+  * This module is intended to be used by flink-dist, rather than by users directly.
+
+### Runtime
+
+* `flink-table-code-splitter`: Tool to split generated Java code so that each method does not exceed the limit of 64KB.
+* `flink-table-runtime`:
+  * Operator implementations
+  * Built-in function implementations
+  * Type system implementation, including readers/writers, converters and utilities
+  * Raw format
+  * The produced jar includes all the classes from this module and `flink-table-code-splitter`, including 3rd party dependencies
+
+### Parser and planner
+
+* `flink-sql-parser`: Default ANSI SQL parser implementation
+* `flink-sql-parser-hive`: Hive SQL dialect parser implementation
+* `flink-table-planner`:
+  * AST and Semantic tree
+  * SQL validator
+  * Planner and rules implementation

Review comment:
       `Optimizer and rules implementation`

##########
File path: flink-examples/flink-examples-table/pom.xml
##########
@@ -34,21 +34,27 @@ under the License.
        <packaging>jar</packaging>
 
        <dependencies>
-
-               <!-- Table ecosystem -->
+               <!-- Flink core -->
                <dependency>
                        <groupId>org.apache.flink</groupId>
-                       <artifactId>flink-table-api-java-bridge</artifactId>
+                       <artifactId>flink-streaming-scala_${scala.binary.version}</artifactId>

Review comment:
       Following your arguments, we don't need `flink-streaming-scala` because `flink-table-api-scala-bridge` is enough?

##########
File path: flink-table/flink-table-planner-loader/pom.xml
##########
@@ -0,0 +1,173 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+  ~ Licensed to the Apache Software Foundation (ASF) under one
+  ~ or more contributor license agreements.  See the NOTICE file
+  ~ distributed with this work for additional information
+  ~ regarding copyright ownership.  The ASF licenses this file
+  ~ to you under the Apache License, Version 2.0 (the
+  ~ "License"); you may not use this file except in compliance
+  ~ with the License.  You may obtain a copy of the License at
+  ~
+  ~     http://www.apache.org/licenses/LICENSE-2.0
+  ~
+  ~ Unless required by applicable law or agreed to in writing, software
+  ~ distributed under the License is distributed on an "AS IS" BASIS,
+  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  ~ See the License for the specific language governing permissions and
+  ~ limitations under the License.
+  -->
+
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+                xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+                xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+       <modelVersion>4.0.0</modelVersion>
+
+       <parent>
+               <groupId>org.apache.flink</groupId>
+               <artifactId>flink-table</artifactId>
+               <version>1.15-SNAPSHOT</version>
+               <relativePath>..</relativePath>
+       </parent>
+
+       <artifactId>flink-table-planner-loader</artifactId>
+       <name>Flink : Table : Planner Loader</name>
+       <packaging>jar</packaging>
+       <description>This module contains the mechanism for loading flink-table-planner through a separate classloader.
+               This allows arbitrary Scala versions in the classpath, hiding the Scala version used by the planner.</description>

Review comment:
       nit: use same block formatting as other module `pom.xml`s

##########
File path: flink-table/flink-sql-client/pom.xml
##########
@@ -53,43 +53,13 @@ under the License.
                        <version>${project.version}</version>
                </dependency>
 
-               <dependency>

Review comment:
       Following your arguments, we can remove `flink-core` as well, as it is transitively pulled in via `flink-table-api-java-bridge` and `flink-clients`.

##########
File path: flink-table/flink-table-planner/pom.xml
##########
@@ -49,38 +48,42 @@ under the License.
                        <groupId>org.codehaus.janino</groupId>
                        <artifactId>commons-compiler</artifactId>
                </dependency>
-               <!-- Used for code generation -->
                <dependency>
+                       <!-- Used for code generation -->
                        <groupId>org.codehaus.janino</groupId>
                        <artifactId>janino</artifactId>
                </dependency>
 
-               <!-- core dependencies -->
+               <!-- Table dependencies -->
+
+               <!-- Table API Java dependencies (not included in the uber) -->
 
                <dependency>
                        <groupId>org.apache.flink</groupId>
                        <artifactId>flink-table-common</artifactId>
                        <version>${project.version}</version>
                </dependency>
-
                <dependency>
                        <groupId>org.apache.flink</groupId>
                        <artifactId>flink-table-api-java</artifactId>
                        <version>${project.version}</version>
                </dependency>
-
                <dependency>
                        <groupId>org.apache.flink</groupId>
-                       <artifactId>flink-table-api-bridge-base</artifactId>
+                       <artifactId>flink-table-api-java-bridge</artifactId>

Review comment:
       drop `flink-table-common` and `flink-table-api-java` then?

##########
File path: flink-table/README.md
##########
@@ -0,0 +1,67 @@
+# Table API & SQL
+
+Apache Flink features two relational APIs - the Table API and SQL - for unified stream and batch processing.
+The Table API is a language-integrated query API for Java, Scala, and Python that allows the composition of queries from relational operators such as selection, filter, and join in a very intuitive way.
+
+This documentation is intended for contributors to the table modules, not for users.
+If you want to use Table API & SQL, check out the [documentation](https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/overview/).
+
+## Modules
+
+### Common
+
+* `flink-table-common`:
+  * Type system definition
+  * UDF stack and built-in function definitions
+  * Internal data definitions
+  * Extension points for catalogs, formats, connectors
+  * Core APIs for extension points such as `Schema`
+  * Utilities to deal with type system, internal data types and printing
+  * When implementing a format, you usually need to depend only on this module
+
+### API
+
+* `flink-table-api-java`: 
+  * Java APIs for Table API and SQL
+  * Package `org.apache.flink.table.delegation`, which serves as entrypoint for all planner capabilities
+* `flink-table-api-scala`: Scala APIs for Table API and SQL
+* `flink-table-api-bridge-base`: Base classes for APIs to bridge between Table API and DataStream API
+* `flink-table-api-java-bridge`:
+  * Java APIs to bridge between Table API and DataStream API
+  * When implementing a connector, you usually need to depend only on this module, in order to bridge your connector implementation developed with DataStream to Table API
+* `flink-table-api-scala-bridge`: Scala APIs to bridge between Table API and DataStream API
+* `flink-table-api-java-uber`:
+  * Uber JAR bundling `flink-table-common` and all the Java API modules, including 3rd party dependencies.

Review comment:
       `all the Java API modules including the bridging to DataStream API`

##########
File path: flink-table/README.md
##########
@@ -0,0 +1,67 @@
+# Table API & SQL
+
+Apache Flink features two relational APIs - the Table API and SQL - for unified stream and batch processing.
+The Table API is a language-integrated query API for Java, Scala, and Python that allows the composition of queries from relational operators such as selection, filter, and join in a very intuitive way.
+
+This documentation is intended for contributors to the table modules, not for users.
+If you want to use Table API & SQL, check out the [documentation](https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/overview/).
+
+## Modules
+
+### Common
+
+* `flink-table-common`:
+  * Type system definition
+  * UDF stack and built-in function definitions
+  * Internal data definitions
+  * Extension points for catalogs, formats, connectors
+  * Core APIs for extension points such as `Schema`
+  * Utilities to deal with type system, internal data types and printing
+  * When implementing a format, you usually need to depend only on this module
+
+### API
+
+* `flink-table-api-java`: 
+  * Java APIs for Table API and SQL
+  * Package `org.apache.flink.table.delegation`, which serves as entrypoint for all planner capabilities
+* `flink-table-api-scala`: Scala APIs for Table API and SQL
+* `flink-table-api-bridge-base`: Base classes for APIs to bridge between Table API and DataStream API
+* `flink-table-api-java-bridge`:
+  * Java APIs to bridge between Table API and DataStream API
+  * When implementing a connector, you usually need to depend only on this module, in order to bridge your connector implementation developed with DataStream to Table API

Review comment:
       This line is not entirely true. The long-term goal is to depend only on `table-common`. For new sources this is already the case.
   
   ```
   Connectors that are developed using DataStream API usually need to depend only on this module.
   ```

##########
File path: flink-table/flink-table-planner-loader/pom.xml
##########
@@ -0,0 +1,173 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+  ~ Licensed to the Apache Software Foundation (ASF) under one
+  ~ or more contributor license agreements.  See the NOTICE file
+  ~ distributed with this work for additional information
+  ~ regarding copyright ownership.  The ASF licenses this file
+  ~ to you under the Apache License, Version 2.0 (the
+  ~ "License"); you may not use this file except in compliance
+  ~ with the License.  You may obtain a copy of the License at
+  ~
+  ~     http://www.apache.org/licenses/LICENSE-2.0
+  ~
+  ~ Unless required by applicable law or agreed to in writing, software
+  ~ distributed under the License is distributed on an "AS IS" BASIS,
+  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  ~ See the License for the specific language governing permissions and
+  ~ limitations under the License.
+  -->
+
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+                xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+                xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+       <modelVersion>4.0.0</modelVersion>
+
+       <parent>
+               <groupId>org.apache.flink</groupId>
+               <artifactId>flink-table</artifactId>
+               <version>1.15-SNAPSHOT</version>
+               <relativePath>..</relativePath>
+       </parent>
+
+       <artifactId>flink-table-planner-loader</artifactId>
+       <name>Flink : Table : Planner Loader</name>
+       <packaging>jar</packaging>
+       <description>This module contains the mechanism for loading flink-table-planner through a separate classloader.
+               This allows arbitrary Scala versions in the classpath, hiding the Scala version used by the planner.</description>
+
+       <dependencies>
+               <dependency>

Review comment:
       can be dropped

##########
File path: flink-table/flink-table-planner/pom.xml
##########
@@ -105,23 +107,15 @@ under the License.
                        </exclusions>
                </dependency>
 
-               <dependency>
-                       <groupId>org.apache.flink</groupId>
-                       <artifactId>flink-table-runtime</artifactId>
-                       <version>${project.version}</version>
-               </dependency>
+               <!-- Table Runtime (not included in the uber) -->
 
                <dependency>
                        <groupId>org.apache.flink</groupId>
-                       <artifactId>flink-scala_${scala.binary.version}</artifactId>
+                       <artifactId>flink-table-runtime</artifactId>

Review comment:
       `provided`?

##########
File path: flink-table/flink-table-planner-loader/src/main/java/org/apache/flink/table/planner/loader/PlannerModule.java
##########
@@ -0,0 +1,143 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.loader;
+
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.configuration.ConfigurationUtils;
+import org.apache.flink.configuration.CoreOptions;
+import org.apache.flink.core.classloading.ComponentClassLoader;
+import org.apache.flink.core.classloading.SubmoduleClassLoader;
+import org.apache.flink.runtime.rpc.RpcSystem;
+import org.apache.flink.table.api.TableException;
+import org.apache.flink.table.delegation.ExecutorFactory;
+import org.apache.flink.table.delegation.ExpressionParserFactory;
+import org.apache.flink.table.delegation.PlannerFactory;
+import org.apache.flink.table.factories.FactoryUtil;
+import org.apache.flink.util.IOUtils;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.URL;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.Arrays;
+import java.util.UUID;
+import java.util.stream.Stream;
+
+/**
+ * Module holder that loads the flink-table-planner module in a separate classpath.
+ *
+ * <p>This loader expects the flink-table-planner jar to be accessible via {@link
+ * ClassLoader#getResource(String)}. It will extract the jar into a temporary directory and create a
+ * new {@link SubmoduleClassLoader} to load the various planner factories from that jar.
+ */
+class PlannerModule {
+
+    /**
+     * The name of the table planner dependency jar, bundled with flink-table-planner-loader module
+     * artifact.
+     */
+    static final String FLINK_TABLE_PLANNER_FAT_JAR = "flink-table-planner.jar";
+
+    static final String HINT_USAGE =
+            "mvn clean package -pl flink-table/flink-table-planner,flink-table/flink-table-planner-loader -DskipTests";
+
+    private final ClassLoader submoduleClassLoader;
+
+    private PlannerModule() {
+        try {
+            final ClassLoader flinkClassLoader = RpcSystem.class.getClassLoader();

Review comment:
       `RpcSystem`?
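The extraction-and-isolation pattern the Javadoc above describes can be sketched roughly as follows. This is a hypothetical illustration, not the actual Flink code: the `JarResourceLoader` class is invented for the example, and a plain `URLClassLoader` stands in for Flink's `SubmoduleClassLoader`.

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.util.UUID;

// Sketch of the pattern: locate a jar bundled as a classpath resource,
// copy it to a temporary file, and serve classes from it through a
// dedicated child classloader so its dependencies stay isolated.
class JarResourceLoader {

    static ClassLoader loadBundledJar(ClassLoader owner, String resourceName) throws IOException {
        try (InputStream in = owner.getResourceAsStream(resourceName)) {
            if (in == null) {
                throw new IllegalStateException(
                        "Resource " + resourceName + " not found on the classpath");
            }
            // Extract to a uniquely named temp file so concurrent JVMs don't clash.
            Path tempJar = Files.createTempFile("planner_" + UUID.randomUUID(), ".jar");
            tempJar.toFile().deleteOnExit();
            Files.copy(in, tempJar, StandardCopyOption.REPLACE_EXISTING);
            // Child classloader that resolves classes from the extracted jar.
            return new URLClassLoader(new URL[] {tempJar.toUri().toURL()}, owner);
        }
    }
}
```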

##########
File path: flink-table/flink-table-planner/pom.xml
##########
@@ -356,103 +351,130 @@ under the License.
                        <plugin>
                                <groupId>org.apache.maven.plugins</groupId>
                                <artifactId>maven-shade-plugin</artifactId>
+                               <configuration>
+                                       <!-- Base config -->
+                                       <createDependencyReducedPom>true</createDependencyReducedPom>
+                                       <dependencyReducedPomLocation>${project.basedir}/target/dependency-reduced-pom.xml</dependencyReducedPomLocation>
+                                       <filters combine.children="append">
+                                               <filter>
+                                                       <artifact>*:*</artifact>
+                                                       <excludes>
+                                                               <!-- Exclude all these files for a clean flink-table-planner JAR -->
+                                                               <exclude>org-apache-calcite-jdbc.properties</exclude>
+                                                               <exclude>common.proto</exclude>
+                                                               <exclude>requests.proto</exclude>
+                                                               <exclude>responses.proto</exclude>
+                                                               <exclude>codegen/**</exclude>
+                                                               <exclude>META-INF/*.SF</exclude>
+                                                               <exclude>META-INF/*.DSA</exclude>
+                                                               <exclude>META-INF/*.RSA</exclude>
+                                                               <exclude>META-INF/services/java.sql.Driver</exclude>
+                                                               <exclude>META-INF/versions/11/module-info.class</exclude>
+                                                               <!-- com.ibm.icu:icu4j includes a LICENSE file in its root folder -->
+                                                               <exclude>LICENSE</exclude>
+                                                       </excludes>
+                                               </filter>
+                                       </filters>
+                                       <artifactSet>
+                                               <includes combine.children="append">
+                                                       <include>org.apache.calcite:*</include>
+                                                       <include>org.apache.calcite.avatica:*</include>
+
+                                                       <!-- Calcite's dependencies -->
+                                                       <include>com.esri.geometry:esri-geometry-api</include>
+                                                       <include>com.google.guava:guava</include>
+                                                       <include>com.google.guava:failureaccess</include>
+                                                       <include>commons-codec:commons-codec</include>
+                                                       <include>commons-io:commons-io</include>
+
+                                                       <!-- flink-table-planner dependencies -->
+                                                       <include>org.apache.flink:flink-sql-parser</include>
+                                                       <include>org.apache.flink:flink-sql-parser-hive</include>
+
+                                                       <!-- For legacy string expressions in Table API -->
+                                                       <include>org.scala-lang.modules:scala-parser-combinators_${scala.binary.version}</include>
+
+                                                       <!-- ReflectionsUtil -->
+                                                       <include>org.reflections:reflections</include>

Review comment:
       Isn't this only used for tests? What is it used for? JSON plan?

##########
File path: flink-table/flink-table-planner/pom.xml
##########
@@ -205,51 +205,64 @@ under the License.
                        </exclusions>
                </dependency>
 
-               <!-- Tools to unify display format for different languages -->
                <dependency>
-                       <groupId>com.ibm.icu</groupId>
-                       <artifactId>icu4j</artifactId>
-                       <version>67.1</version>
+                       <!-- For legacy string expressions in Table API -->
+                       <groupId>org.scala-lang.modules</groupId>
+                       <artifactId>scala-parser-combinators_${scala.binary.version}</artifactId>
                </dependency>
 
-               <!-- For legacy string expressions in Table API -->
                <dependency>
-                       <groupId>org.scala-lang.modules</groupId>
-                       <artifactId>scala-parser-combinators_${scala.binary.version}</artifactId>
+                       <!-- Utility to scan classpaths -->
+                       <groupId>org.reflections</groupId>
+                       <artifactId>reflections</artifactId>
+                       <version>0.9.10</version>
+                       <scope>compile</scope>
+                       <exclusions>
+                               <exclusion>
+                                       <groupId>com.google.code.findbugs</groupId>
+                                       <artifactId>annotations</artifactId>
+                               </exclusion>
+                               <exclusion>
+                                       <groupId>com.google.guava</groupId>
+                                       <artifactId>guava</artifactId>
+                               </exclusion>
+                       </exclusions>
                </dependency>
 
-               <!-- test dependencies -->
+               <!-- Test dependencies -->
+
                <dependency>
                        <groupId>org.apache.flink</groupId>
                        <artifactId>flink-test-utils</artifactId>
                        <version>${project.version}</version>
                        <scope>test</scope>
                </dependency>
 
+               <!-- Table API Scala dependencies (included in the uber) -->

Review comment:
       Test dependencies included in the uber?

##########
File path: flink-table/flink-table-runtime/pom.xml
##########
@@ -64,35 +62,38 @@ under the License.
                        <version>${project.version}</version>
                </dependency>
 
+               <!-- Flink dependencies -->
                <dependency>
                        <groupId>org.apache.flink</groupId>
                        <artifactId>flink-streaming-java</artifactId>
                        <version>${project.version}</version>
-                       <scope>provided</scope>

Review comment:
       Why not `provided` anymore?

##########
File path: flink-table/README.md
##########
@@ -0,0 +1,62 @@
+# Table API & SQL
+
+Apache Flink features two relational APIs - the Table API and SQL - for unified stream and batch processing.
+The Table API is a language-integrated query API for Java, Scala, and Python that allows the composition of queries from relational operators such as selection, filter, and join in a very intuitive way.
+
+For more details on how to use it, check out the [documentation](https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/overview/).
+
+## Modules
+
+### Common
+
+* `flink-table-common`:
+  * Type system definition and UDF stack
+  * Internal data types definitions
+  * `Factory` definitions for catalogs, formats, connectors
+  * Other core APIs such as `Schema`
+  * Utilities to deal with type system, internal data types and printing
+  * When implementing a format, you usually need to depend only on this module
+
+### API
+
+* `flink-table-api-java`: 
+  * Java APIs for Table API and SQL
+  * Package `org.apache.flink.table.delegation`, which serves as entrypoint for all planner capabilities
+* `flink-table-api-scala`: Scala APIs for Table API and SQL
+* `flink-table-api-bridge-base`: Base classes for APIs to bridge between Table API and DataStream API
+* `flink-table-api-java-bridge`:
+  * Java APIs to bridge between Table API and DataStream API
+  * When implementing a connector, you usually need to depend only on this module, in order to bridge your connector implementation developed with DataStream to Table API
+* `flink-table-api-scala-bridge`: Scala APIs to bridge between Table API and DataStream API
+* `flink-table-api-uber`: Uber JAR bundling `flink-table-common` and all the Java API modules, including 3rd party dependencies.
+
+### Runtime
+
+* `flink-table-code-splitter`: Tool to split generated Java code so that each method does not exceed the limit of 64KB.
+* `flink-table-runtime`:
+  * Operator implementations
+  * Built-in function implementations
+  * Type system implementation, including readers/writers, converters and utilities
+  * Raw format
+  * The produced jar includes all the classes from this module and `flink-table-code-splitter`, including 3rd party dependencies
+
+### Parser and planner
+
+* `flink-sql-parser`: Default ANSI SQL parser implementation
+* `flink-sql-parser-hive`: Hive SQL dialect parser implementation
+* `flink-table-planner`:
+  * AST and Semantic tree
+  * SQL validator
+  * Planner and rules implementation
+  * Two jars are produced: one has no classifier and bundles all the classes from this module together with the two parsers, including 3rd party dependencies, while the other jar, classified as `loader-bundle`, extends the first jar with the Scala dependencies.
+* `flink-table-planner-loader`: Loader for `flink-table-planner` that loads the planner in a separate classpath, isolating the Scala version used to compile the planner.
+
+### SQL client
+
+* `flink-sql-client`: CLI tool to submit queries to a Flink cluster
+
+### Notes
+
+No module except `flink-table-planner` should depend on `flink-table-runtime` in production classpath, 

Review comment:
       But it needs to be added as a dependency, otherwise you cannot run it in your IDE. Many people have problems with setting up Maven correctly. We should rephrase this to emphasize that `provided` is fine.
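One way to phrase the suggestion concretely: a module that needs `flink-table-runtime` on the compile/IDE classpath but must not leak it into the production classpath can declare it with `provided` scope. This is a sketch of such a dependency declaration, not an excerpt from any actual pom in this PR:

```xml
<!-- Sketch: makes flink-table-runtime available for compilation and IDE runs,
     but keeps it out of the transitive production classpath. -->
<dependency>
	<groupId>org.apache.flink</groupId>
	<artifactId>flink-table-runtime</artifactId>
	<version>${project.version}</version>
	<scope>provided</scope>
</dependency>
```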

##########
File path: flink-table/flink-table-planner-loader/src/main/java/org/apache/flink/table/planner/loader/PlannerModule.java
##########
@@ -0,0 +1,143 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.loader;
+
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.configuration.ConfigurationUtils;
+import org.apache.flink.configuration.CoreOptions;
+import org.apache.flink.core.classloading.ComponentClassLoader;
+import org.apache.flink.core.classloading.SubmoduleClassLoader;
+import org.apache.flink.runtime.rpc.RpcSystem;
+import org.apache.flink.table.api.TableException;
+import org.apache.flink.table.delegation.ExecutorFactory;
+import org.apache.flink.table.delegation.ExpressionParserFactory;
+import org.apache.flink.table.delegation.PlannerFactory;
+import org.apache.flink.table.factories.FactoryUtil;
+import org.apache.flink.util.IOUtils;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.URL;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.Arrays;
+import java.util.UUID;
+import java.util.stream.Stream;
+
+/**
+ * Module holder that loads the flink-table-planner module in a separate classpath.
+ *
+ * <p>This loader expects the flink-table-planner jar to be accessible via {@link
+ * ClassLoader#getResource(String)}. It will extract the jar into a temporary directory and create a
+ * new {@link SubmoduleClassLoader} to load the various planner factories from that jar.
+ */
+class PlannerModule {
+
+    /**
+     * The name of the table planner dependency jar, bundled with flink-table-planner-loader module
+     * artifact.
+     */
+    static final String FLINK_TABLE_PLANNER_FAT_JAR = "flink-table-planner.jar";
+
+    static final String HINT_USAGE =
+            "mvn clean package -pl flink-table/flink-table-planner,flink-table/flink-table-planner-loader -DskipTests";
+
+    private final ClassLoader submoduleClassLoader;
+
+    private PlannerModule() {
+        try {
+            final ClassLoader flinkClassLoader = RpcSystem.class.getClassLoader();
+
+            final Path tmpDirectory =
+                    Paths.get(ConfigurationUtils.parseTempDirectories(new Configuration())[0]);
+            Files.createDirectories(tmpDirectory);
+            final Path tempFile =
+                    Files.createFile(
+                            tmpDirectory.resolve(
+                                    "flink-table-planner_" + UUID.randomUUID() + ".jar"));
+
+            final InputStream resourceStream =
+                    flinkClassLoader.getResourceAsStream(FLINK_TABLE_PLANNER_FAT_JAR);
+            if (resourceStream == null) {
+                throw new TableException(

Review comment:
       I guess this will also be the exception if the planner has been removed from `/opt`. Maybe we should also adapt the exception to something more user-friendly? For the RPC system this was fine, but for the planner this might happen more frequently.
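A sketch of what a more user-facing message could look like. The helper class and wording are hypothetical suggestions, not the actual Flink code; only the `mvn` hint reuses the `HINT_USAGE` string already defined in this file.

```java
// Hypothetical helper illustrating a more actionable error for a missing
// planner jar, covering the "jar swapped from /opt" case the review mentions.
class PlannerErrors {

    static IllegalStateException plannerJarMissing(String jarName) {
        return new IllegalStateException(
                "Could not find "
                        + jarName
                        + " on the classpath. If you replaced flink-table-planner-loader "
                        + "with flink-table-planner from /opt, make sure exactly one of "
                        + "the two jars is present in lib/. When building from source, run: "
                        + "mvn clean package -pl flink-table/flink-table-planner,"
                        + "flink-table/flink-table-planner-loader -DskipTests");
    }
}
```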

##########
File path: flink-table/flink-table-runtime/pom.xml
##########
@@ -153,9 +155,13 @@ under the License.
                                                        <artifactSet>
                                                                <includes combine.children="append">
                                                                        <include>com.jayway.jsonpath:json-path</include>
+

Review comment:
       remove empty line




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
