[https://issues.apache.org/jira/browse/FLINK-31659?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17706741#comment-17706741]
Martijn Visser commented on FLINK-31659:
----------------------------------------
[~luoyuxia] I'm referring to the fact that the Hive connector is added to the
classpath and therefore loaded by Flink. That means you're then "using" Hive,
even if you don't reference it in your SQL statement. But I'm happy to see that
you've found a way we can mitigate it :) Thanks for that
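
The "loaded even if unused" behaviour comes from Java's service discovery: iterating a {{ServiceLoader}} forces every provider class registered in {{META-INF/services}} to load, so a jar dropped into {{lib}} participates in factory discovery whether or not any statement references it. A minimal sketch of that behaviour, using a JDK service interface as a stand-in for Flink's actual {{Factory}} SPI:

```java
import java.nio.file.spi.FileSystemProvider;
import java.util.ServiceLoader;

public class EagerDiscoveryDemo {
    public static void main(String[] args) {
        // Iterating a ServiceLoader instantiates every registered provider --
        // each provider class is loaded whether or not the caller ever
        // selects it.
        for (FileSystemProvider p : ServiceLoader.load(FileSystemProvider.class)) {
            System.out.println("loaded: " + p.getClass().getName());
        }

        // If a loaded provider references a class that is absent from the
        // classpath (here: the planner class from this ticket), the load
        // fails even though the provider was never asked for:
        try {
            Class.forName("org.apache.flink.table.planner.delegation.DialectFactory");
        } catch (ClassNotFoundException e) {
            System.out.println("missing dependency: " + e.getMessage());
        }
    }
}
```

In the ticket below, the provider that trips over the missing class is Hive's, pulled in during format discovery for an unrelated filesystem table.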
> java.lang.ClassNotFoundException:
> org.apache.flink.table.planner.delegation.DialectFactory when bundled Hive
> connector jar is in classpath
> ------------------------------------------------------------------------------------------------------------------------------------------
>
> Key: FLINK-31659
> URL: https://issues.apache.org/jira/browse/FLINK-31659
> Project: Flink
> Issue Type: Bug
> Components: Connectors / Hive
> Affects Versions: 1.17.0, 1.16.1, 1.15.4
> Reporter: Caizhi Weng
> Priority: Major
>
> Steps to reproduce this bug:
> 1. Download a fresh 1.16.1 Flink distribution.
> 2. Add both {{flink-sql-connector-hive-2.3.9_2.12-1.16.0.jar}} and
> {{flink-shaded-hadoop-2-uber-2.8.3-10.0.jar}} to the {{lib}} directory.
> 3. Start a standalone cluster with {{bin/start-cluster.sh}}.
> 4. Start SQL client and run the following SQL.
> {code:sql}
> create table T (
>   a int,
>   b string
> ) with (
>   'connector' = 'filesystem',
>   'format' = 'json',
>   'path' = '/tmp/gao.json'
> );
>
> create table S (
>   a int,
>   b string
> ) with (
>   'connector' = 'print'
> );
>
> insert into S select * from T;
> {code}
> The following exception will occur.
> {code}
> org.apache.flink.table.client.gateway.SqlExecutionException: Failed to parse statement: insert into S select * from T;
>     at org.apache.flink.table.client.gateway.local.LocalExecutor.parseStatement(LocalExecutor.java:174) ~[flink-sql-client-1.16.1.jar:1.16.1]
>     at org.apache.flink.table.client.cli.SqlCommandParserImpl.parseCommand(SqlCommandParserImpl.java:45) ~[flink-sql-client-1.16.1.jar:1.16.1]
>     at org.apache.flink.table.client.cli.SqlMultiLineParser.parse(SqlMultiLineParser.java:71) ~[flink-sql-client-1.16.1.jar:1.16.1]
>     at org.jline.reader.impl.LineReaderImpl.acceptLine(LineReaderImpl.java:2964) ~[flink-sql-client-1.16.1.jar:1.16.1]
>     at org.jline.reader.impl.LineReaderImpl$1.apply(LineReaderImpl.java:3778) ~[flink-sql-client-1.16.1.jar:1.16.1]
>     at org.jline.reader.impl.LineReaderImpl.readLine(LineReaderImpl.java:679) ~[flink-sql-client-1.16.1.jar:1.16.1]
>     at org.apache.flink.table.client.cli.CliClient.getAndExecuteStatements(CliClient.java:295) [flink-sql-client-1.16.1.jar:1.16.1]
>     at org.apache.flink.table.client.cli.CliClient.executeInteractive(CliClient.java:280) [flink-sql-client-1.16.1.jar:1.16.1]
>     at org.apache.flink.table.client.cli.CliClient.executeInInteractiveMode(CliClient.java:228) [flink-sql-client-1.16.1.jar:1.16.1]
>     at org.apache.flink.table.client.SqlClient.openCli(SqlClient.java:151) [flink-sql-client-1.16.1.jar:1.16.1]
>     at org.apache.flink.table.client.SqlClient.start(SqlClient.java:95) [flink-sql-client-1.16.1.jar:1.16.1]
>     at org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:187) [flink-sql-client-1.16.1.jar:1.16.1]
>     at org.apache.flink.table.client.SqlClient.main(SqlClient.java:161) [flink-sql-client-1.16.1.jar:1.16.1]
> Caused by: org.apache.flink.table.api.ValidationException: Unable to create a source for reading table 'default_catalog.default_database.T'.
> Table options are:
> 'connector'='filesystem'
> 'format'='json'
> 'path'='/tmp/gao.json'
>     at org.apache.flink.table.factories.FactoryUtil.createDynamicTableSource(FactoryUtil.java:166) ~[flink-table-api-java-uber-1.16.1.jar:1.16.1]
>     at org.apache.flink.table.factories.FactoryUtil.createDynamicTableSource(FactoryUtil.java:191) ~[flink-table-api-java-uber-1.16.1.jar:1.16.1]
>     at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.createDynamicTableSource(CatalogSourceTable.java:175) ~[?:?]
>     at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.toRel(CatalogSourceTable.java:115) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.toRel(SqlToRelConverter.java:3619) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertIdentifier(SqlToRelConverter.java:2559) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2175) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2095) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2038) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:669) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:657) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:3462) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:570) ~[?:?]
>     at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$rel(FlinkPlannerImpl.scala:215) ~[?:?]
>     at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.rel(FlinkPlannerImpl.scala:191) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.toQueryOperation(SqlToOperationConverter.java:1498) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlQuery(SqlToOperationConverter.java:1253) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertValidatedSqlNode(SqlToOperationConverter.java:374) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertValidatedSqlNodeOrFail(SqlToOperationConverter.java:384) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlInsert(SqlToOperationConverter.java:828) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertValidatedSqlNode(SqlToOperationConverter.java:351) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:262) ~[?:?]
>     at org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:106) ~[?:?]
>     at org.apache.flink.table.client.gateway.local.LocalExecutor.parseStatement(LocalExecutor.java:172) ~[flink-sql-client-1.16.1.jar:1.16.1]
>     ... 12 more
> Caused by: java.lang.NoClassDefFoundError: org/apache/flink/table/planner/delegation/DialectFactory
>     at java.lang.ClassLoader.defineClass1(Native Method) ~[?:1.8.0_361]
>     at java.lang.ClassLoader.defineClass(ClassLoader.java:756) ~[?:1.8.0_361]
>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142) ~[?:1.8.0_361]
>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:473) ~[?:1.8.0_361]
>     at java.net.URLClassLoader.access$100(URLClassLoader.java:74) ~[?:1.8.0_361]
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:369) ~[?:1.8.0_361]
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:363) ~[?:1.8.0_361]
>     at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_361]
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:362) ~[?:1.8.0_361]
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:418) ~[?:1.8.0_361]
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355) ~[?:1.8.0_361]
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:405) ~[?:1.8.0_361]
>     at org.apache.flink.util.FlinkUserCodeClassLoader.loadClassWithoutExceptionHandling(FlinkUserCodeClassLoader.java:67) ~[flink-dist-1.16.1.jar:1.16.1]
>     at org.apache.flink.util.ChildFirstClassLoader.loadClassWithoutExceptionHandling(ChildFirstClassLoader.java:65) ~[flink-dist-1.16.1.jar:1.16.1]
>     at org.apache.flink.util.FlinkUserCodeClassLoader.loadClass(FlinkUserCodeClassLoader.java:51) ~[flink-dist-1.16.1.jar:1.16.1]
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:351) ~[?:1.8.0_361]
>     at org.apache.flink.util.FlinkUserCodeClassLoaders$SafetyNetWrapperClassLoader.loadClass(FlinkUserCodeClassLoaders.java:192) ~[flink-dist-1.16.1.jar:1.16.1]
>     at java.lang.Class.forName0(Native Method) ~[?:1.8.0_361]
>     at java.lang.Class.forName(Class.java:348) ~[?:1.8.0_361]
>     at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370) ~[?:1.8.0_361]
>     at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404) ~[?:1.8.0_361]
>     at java.util.ServiceLoader$1.next(ServiceLoader.java:480) ~[?:1.8.0_361]
>     at java.util.Iterator.forEachRemaining(Iterator.java:116) ~[?:1.8.0_361]
>     at org.apache.flink.connector.file.table.FileSystemTableFactory.formatFactoryExists(FileSystemTableFactory.java:206) ~[flink-connector-files-1.16.1.jar:1.16.1]
>     at org.apache.flink.connector.file.table.FileSystemTableFactory.discoverDecodingFormat(FileSystemTableFactory.java:172) ~[flink-connector-files-1.16.1.jar:1.16.1]
>     at org.apache.flink.connector.file.table.FileSystemTableFactory.createDynamicTableSource(FileSystemTableFactory.java:77) ~[flink-connector-files-1.16.1.jar:1.16.1]
>     at org.apache.flink.table.factories.FactoryUtil.createDynamicTableSource(FactoryUtil.java:163) ~[flink-table-api-java-uber-1.16.1.jar:1.16.1]
>     at org.apache.flink.table.factories.FactoryUtil.createDynamicTableSource(FactoryUtil.java:191) ~[flink-table-api-java-uber-1.16.1.jar:1.16.1]
>     at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.createDynamicTableSource(CatalogSourceTable.java:175) ~[?:?]
>     at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.toRel(CatalogSourceTable.java:115) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.toRel(SqlToRelConverter.java:3619) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertIdentifier(SqlToRelConverter.java:2559) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2175) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2095) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2038) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:669) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:657) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:3462) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:570) ~[?:?]
>     at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$rel(FlinkPlannerImpl.scala:215) ~[?:?]
>     at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.rel(FlinkPlannerImpl.scala:191) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.toQueryOperation(SqlToOperationConverter.java:1498) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlQuery(SqlToOperationConverter.java:1253) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertValidatedSqlNode(SqlToOperationConverter.java:374) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertValidatedSqlNodeOrFail(SqlToOperationConverter.java:384) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlInsert(SqlToOperationConverter.java:828) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertValidatedSqlNode(SqlToOperationConverter.java:351) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:262) ~[?:?]
>     at org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:106) ~[?:?]
>     at org.apache.flink.table.client.gateway.local.LocalExecutor.parseStatement(LocalExecutor.java:172) ~[flink-sql-client-1.16.1.jar:1.16.1]
>     ... 12 more
> Caused by: java.lang.ClassNotFoundException: org.apache.flink.table.planner.delegation.DialectFactory
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:387) ~[?:1.8.0_361]
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:418) ~[?:1.8.0_361]
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355) ~[?:1.8.0_361]
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:351) ~[?:1.8.0_361]
>     at java.lang.ClassLoader.defineClass1(Native Method) ~[?:1.8.0_361]
>     at java.lang.ClassLoader.defineClass(ClassLoader.java:756) ~[?:1.8.0_361]
>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142) ~[?:1.8.0_361]
>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:473) ~[?:1.8.0_361]
>     at java.net.URLClassLoader.access$100(URLClassLoader.java:74) ~[?:1.8.0_361]
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:369) ~[?:1.8.0_361]
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:363) ~[?:1.8.0_361]
>     at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_361]
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:362) ~[?:1.8.0_361]
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:418) ~[?:1.8.0_361]
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355) ~[?:1.8.0_361]
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:405) ~[?:1.8.0_361]
>     at org.apache.flink.util.FlinkUserCodeClassLoader.loadClassWithoutExceptionHandling(FlinkUserCodeClassLoader.java:67) ~[flink-dist-1.16.1.jar:1.16.1]
>     at org.apache.flink.util.ChildFirstClassLoader.loadClassWithoutExceptionHandling(ChildFirstClassLoader.java:65) ~[flink-dist-1.16.1.jar:1.16.1]
>     at org.apache.flink.util.FlinkUserCodeClassLoader.loadClass(FlinkUserCodeClassLoader.java:51) ~[flink-dist-1.16.1.jar:1.16.1]
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:351) ~[?:1.8.0_361]
>     at org.apache.flink.util.FlinkUserCodeClassLoaders$SafetyNetWrapperClassLoader.loadClass(FlinkUserCodeClassLoaders.java:192) ~[flink-dist-1.16.1.jar:1.16.1]
>     at java.lang.Class.forName0(Native Method) ~[?:1.8.0_361]
>     at java.lang.Class.forName(Class.java:348) ~[?:1.8.0_361]
>     at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370) ~[?:1.8.0_361]
>     at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404) ~[?:1.8.0_361]
>     at java.util.ServiceLoader$1.next(ServiceLoader.java:480) ~[?:1.8.0_361]
>     at java.util.Iterator.forEachRemaining(Iterator.java:116) ~[?:1.8.0_361]
>     at org.apache.flink.connector.file.table.FileSystemTableFactory.formatFactoryExists(FileSystemTableFactory.java:206) ~[flink-connector-files-1.16.1.jar:1.16.1]
>     at org.apache.flink.connector.file.table.FileSystemTableFactory.discoverDecodingFormat(FileSystemTableFactory.java:172) ~[flink-connector-files-1.16.1.jar:1.16.1]
>     at org.apache.flink.connector.file.table.FileSystemTableFactory.createDynamicTableSource(FileSystemTableFactory.java:77) ~[flink-connector-files-1.16.1.jar:1.16.1]
>     at org.apache.flink.table.factories.FactoryUtil.createDynamicTableSource(FactoryUtil.java:163) ~[flink-table-api-java-uber-1.16.1.jar:1.16.1]
>     at org.apache.flink.table.factories.FactoryUtil.createDynamicTableSource(FactoryUtil.java:191) ~[flink-table-api-java-uber-1.16.1.jar:1.16.1]
>     at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.createDynamicTableSource(CatalogSourceTable.java:175) ~[?:?]
>     at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.toRel(CatalogSourceTable.java:115) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.toRel(SqlToRelConverter.java:3619) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertIdentifier(SqlToRelConverter.java:2559) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2175) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2095) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2038) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:669) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:657) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:3462) ~[?:?]
>     at org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:570) ~[?:?]
>     at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$rel(FlinkPlannerImpl.scala:215) ~[?:?]
>     at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.rel(FlinkPlannerImpl.scala:191) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.toQueryOperation(SqlToOperationConverter.java:1498) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlQuery(SqlToOperationConverter.java:1253) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertValidatedSqlNode(SqlToOperationConverter.java:374) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertValidatedSqlNodeOrFail(SqlToOperationConverter.java:384) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlInsert(SqlToOperationConverter.java:828) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertValidatedSqlNode(SqlToOperationConverter.java:351) ~[?:?]
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:262) ~[?:?]
>     at org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:106) ~[?:?]
>     at org.apache.flink.table.client.gateway.local.LocalExecutor.parseStatement(LocalExecutor.java:172) ~[flink-sql-client-1.16.1.jar:1.16.1]
>     ... 12 more
> {code}
> The
> [documentation|https://nightlies.apache.org/flink/flink-docs-release-1.16/docs/connectors/table/hive/overview/#moving-the-planner-jar]
> states that the planner jar only needs to be moved when the Hive dialect is
> used. However, the SQL above clearly does not use the Hive dialect, yet the
> exception still occurs.
> This bug only happens with filesystem tables; other tables are not affected.
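
The trace shows why only filesystem tables are affected: {{FileSystemTableFactory.formatFactoryExists}} iterates every factory registered via {{ServiceLoader}}, and the iteration aborts as soon as it reaches the Hive connector's provider, whose dependency ({{DialectFactory}} from the swapped-out planner jar) is missing. A generic way to make such a scan tolerant of broken providers is sketched below; this is an illustrative pattern, not Flink's actual fix, and {{loadAll}} is a hypothetical helper:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.ServiceConfigurationError;
import java.util.ServiceLoader;

public class SafeDiscovery {
    // Collect all loadable providers of a service, skipping any provider
    // whose class fails to load instead of aborting the whole scan.
    static <T> List<T> loadAll(Class<T> service) {
        List<T> result = new ArrayList<>();
        Iterator<T> it = ServiceLoader.load(service).iterator();
        while (true) {
            try {
                if (!it.hasNext()) {
                    break;
                }
                result.add(it.next());
            } catch (ServiceConfigurationError e) {
                // A provider with missing dependencies (e.g. one compiled
                // against a planner jar that is not on the classpath) lands
                // here; log and continue with the remaining providers.
                System.err.println("skipping broken provider: " + e.getMessage());
            }
        }
        return result;
    }

    public static void main(String[] args) {
        // Demo against a JDK service interface; with Flink's Factory SPI the
        // same loop would survive the Hive connector's broken entry.
        List<java.nio.file.spi.FileSystemProvider> providers =
                loadAll(java.nio.file.spi.FileSystemProvider.class);
        System.out.println(providers.size() + " providers loaded");
    }
}
```

This works because {{ServiceLoader}}'s lazy iterator advances past the failing entry before throwing, so calling {{hasNext()}} again resumes with the next provider.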
--
This message was sent by Atlassian Jira
(v8.20.10#820010)