[
https://issues.apache.org/jira/browse/FLINK-29218?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17602274#comment-17602274
]
dalongliu edited comment on FLINK-29218 at 9/9/22 11:27 AM:
------------------------------------------------------------
After some tests, I found that it is not easy to resolve the class-loading conflict between Hadoop and Hive, so I think the Hive catalog should not be loaded via the ADD JAR syntax; instead, the Hive connector JAR should be placed in the lib folder. Since ADD JAR has some restrictions with the Hive connector, we should remind users in the release notes how to use the Hive connector correctly.
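A rough sketch of the recommended setup (the connector JAR name below is illustrative and depends on the Flink and Hive versions in use):
{code:bash}
# Put the Hive connector on the classpath instead of registering it with ADD JAR.
mv flink-sql-connector-hive-*.jar "$FLINK_HOME/lib/"
# Restart the SQL client so the JAR is picked up, then create the Hive catalog as usual.
{code}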
was (Author: lsy):
After some tests, I found it is not easy to resolve the class loading issue of
hadoop and hive, so I think we should use hive catalog and by add jar syntax,
we should place it in the lib folder. The add jar has some restrictions to hive
connector, we should remind the user how to use hive connector correctly in the
release note.
> ADD JAR syntax could not work with Hive catalog in SQL client
> -------------------------------------------------------------
>
> Key: FLINK-29218
> URL: https://issues.apache.org/jira/browse/FLINK-29218
> Project: Flink
> Issue Type: Bug
> Components: Connectors / Hive, Table SQL / API
> Affects Versions: 1.16.0
> Reporter: Qingsheng Ren
> Assignee: dalongliu
> Priority: Major
>
> ADD JAR syntax is not working for adding Hive / Hadoop dependencies into the SQL client.
> To reproduce the problem:
> # Place the Hive connector and Hadoop JARs outside {{lib}}, and add them to the session using {{ADD JAR}} syntax.
> # Create a Hive catalog using {{CREATE CATALOG}}
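> A hypothetical session illustrating the two steps (paths and JAR names are placeholders, not the exact artifacts used in the report):
> {code:sql}
> -- Step 1: register the dependencies that live outside lib/
> ADD JAR '/path/to/flink-sql-connector-hive.jar';
> ADD JAR '/path/to/hadoop-dependencies.jar';
> -- Step 2: create the Hive catalog
> CREATE CATALOG myhive WITH (
>   'type' = 'hive',
>   'hive-conf-dir' = '/path/to/hive-conf'
> );
> {code}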
> Exception thrown by SQL client:
> {code:java}
> 2022-09-07 15:23:15,737 WARN org.apache.flink.table.client.cli.CliClient [] - Could not execute SQL statement.
> org.apache.flink.table.client.gateway.SqlExecutionException: Could not execute SQL statement.
> 	at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:208) ~[flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.cli.CliClient.executeOperation(CliClient.java:634) ~[flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.cli.CliClient.callOperation(CliClient.java:468) [flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.cli.CliClient.executeOperation(CliClient.java:371) [flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.cli.CliClient.getAndExecuteStatements(CliClient.java:328) [flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.cli.CliClient.executeInteractive(CliClient.java:279) [flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.cli.CliClient.executeInInteractiveMode(CliClient.java:227) [flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.SqlClient.openCli(SqlClient.java:151) [flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.SqlClient.start(SqlClient.java:95) [flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:187) [flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.SqlClient.main(SqlClient.java:161) [flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> Caused by: org.apache.flink.table.api.ValidationException: Unable to create catalog 'myhive'.
> Catalog options are:
> 'hive-conf-dir'='file:///Users/renqs/Workspaces/flink/flink-master/build-target/opt/hive-conf'
> 'type'='hive'
> 	at org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:438) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalog(TableEnvironmentImpl.java:1423) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1165) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:206) ~[flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	... 10 more
> Caused by: org.apache.flink.table.catalog.exceptions.CatalogException: Failed to load hive-site.xml from specified path:file:/Users/renqs/Workspaces/flink/flink-master/build-target/opt/hive-conf/hive-site.xml
> 	at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:273) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:184) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:76) ~[?:?]
> 	at org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:435) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalog(TableEnvironmentImpl.java:1423) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1165) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:206) ~[flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	... 10 more
> Caused by: java.io.IOException: No FileSystem for scheme: file
> 	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2799) ~[?:?]
> 	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2810) ~[?:?]
> 	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:100) ~[?:?]
> 	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2849) ~[?:?]
> 	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2831) ~[?:?]
> 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389) ~[?:?]
> 	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:356) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:268) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:184) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:76) ~[?:?]
> 	at org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:435) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalog(TableEnvironmentImpl.java:1423) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1165) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:206) ~[flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	... 10 more
> {code}
>
> Another test:
> # Put the Hadoop JAR inside {{lib}} and the Hive connector outside, then add the Hive connector JAR to the session with {{ADD JAR}} syntax.
> # Create a Hive catalog using {{CREATE CATALOG}}
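> As in the first test, the session would look roughly like this, except that only the Hive connector JAR is registered (paths and JAR names are placeholders):
> {code:sql}
> ADD JAR '/path/to/flink-sql-connector-hive.jar';
> CREATE CATALOG myhive WITH (
>   'type' = 'hive',
>   'hive-conf-dir' = '/path/to/hive-conf'
> );
> {code}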
> Exception thrown by SQL client:
> {code:java}
> 2022-09-07 15:29:57,362 WARN org.apache.flink.table.client.cli.CliClient [] - Could not execute SQL statement.
> org.apache.flink.table.client.gateway.SqlExecutionException: Could not execute SQL statement.
> 	at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:208) ~[flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.cli.CliClient.executeOperation(CliClient.java:634) ~[flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.cli.CliClient.callOperation(CliClient.java:468) [flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.cli.CliClient.executeOperation(CliClient.java:371) [flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.cli.CliClient.getAndExecuteStatements(CliClient.java:328) [flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.cli.CliClient.executeInteractive(CliClient.java:279) [flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.cli.CliClient.executeInInteractiveMode(CliClient.java:227) [flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.SqlClient.openCli(SqlClient.java:151) [flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.SqlClient.start(SqlClient.java:95) [flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:187) [flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.SqlClient.main(SqlClient.java:161) [flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> Caused by: org.apache.flink.table.api.ValidationException: Could not execute CREATE CATALOG: (catalogName: [myhive], properties: [{hive-conf-dir=file:///Users/renqs/Workspaces/flink/flink-master/build-target/opt/hive-conf, type=hive}])
> 	at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalog(TableEnvironmentImpl.java:1432) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1165) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:206) ~[flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	... 10 more
> Caused by: org.apache.flink.table.catalog.exceptions.CatalogException: Failed to create Hive Metastore client
> 	at org.apache.flink.table.catalog.hive.client.HiveShimV230.getHiveMetastoreClient(HiveShimV230.java:74) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:283) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:84) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:74) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:32) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:300) ~[?:?]
> 	at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:211) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalog(TableEnvironmentImpl.java:1428) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1165) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:206) ~[flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	... 10 more
> Caused by: java.lang.reflect.InvocationTargetException
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_292]
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_292]
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_292]
> 	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_292]
> 	at org.apache.flink.table.catalog.hive.client.HiveShimV230.getHiveMetastoreClient(HiveShimV230.java:72) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:283) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:84) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:74) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:32) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:300) ~[?:?]
> 	at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:211) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalog(TableEnvironmentImpl.java:1428) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1165) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:206) ~[flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	... 10 more
> Caused by: org.apache.hadoop.hive.metastore.api.MetaException: org.apache.hadoop.hive.metastore.HiveMetaStoreClient class not found
> 	at org.apache.hadoop.hive.metastore.MetaStoreUtils.getClass(MetaStoreUtils.java:1710) ~[?:?]
> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:131) ~[?:?]
> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:89) ~[?:?]
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_292]
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_292]
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_292]
> 	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_292]
> 	at org.apache.flink.table.catalog.hive.client.HiveShimV230.getHiveMetastoreClient(HiveShimV230.java:72) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:283) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:84) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:74) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:32) ~[?:?]
> 	at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:300) ~[?:?]
> 	at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:211) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalog(TableEnvironmentImpl.java:1428) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1165) ~[flink-table-api-java-uber-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:206) ~[flink-sql-client-1.16-SNAPSHOT.jar:1.16-SNAPSHOT]
> 	... 10 more
> {code}
--
This message was sent by Atlassian Jira
(v8.20.10#820010)