[ https://issues.apache.org/jira/browse/FLINK-30646?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Caizhi Weng closed FLINK-30646.
-------------------------------
    Resolution: Fixed

master: 072640a72fe18bf3c0439cd4f0ec7602e0b0ff80
release-0.3: fcea22e114888af239e3a04e130f942df655122e

> Table Store Hive catalog throws ClassNotFoundException when custom 
> hive-site.xml is present
> ---------------------------------------------------------------------------------------------
>
>                 Key: FLINK-30646
>                 URL: https://issues.apache.org/jira/browse/FLINK-30646
>             Project: Flink
>          Issue Type: Bug
>          Components: Table Store
>    Affects Versions: table-store-0.3.0, table-store-0.4.0
>            Reporter: Caizhi Weng
>            Assignee: Caizhi Weng
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: table-store-0.4.0, table-store-0.3.1
>
>
> For Hive 2.3.9, if a custom {{hive-site.xml}} is present in 
> {{$HIVE_HOME/conf}}, the following exception is thrown when creating a Table 
> Store Hive catalog in Flink.
> {code}
> Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl not found
>       at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2273) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
>       at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2367) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
>       at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2393) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
>       at org.apache.flink.table.store.shaded.org.apache.hadoop.hive.metastore.HiveMetaStoreClient.loadFilterHooks(HiveMetaStoreClient.java:250) ~[flink-table-store-hive-catalog-0.4-SNAPSHOT.jar:0.4-SNAPSHOT]
>       at org.apache.flink.table.store.shaded.org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:145) ~[flink-table-store-hive-catalog-0.4-SNAPSHOT.jar:0.4-SNAPSHOT]
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_352]
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_352]
>       at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_352]
>       at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_352]
>       at org.apache.flink.table.store.shaded.org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1740) ~[flink-table-store-hive-catalog-0.4-SNAPSHOT.jar:0.4-SNAPSHOT]
>       at org.apache.flink.table.store.shaded.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83) ~[flink-table-store-hive-catalog-0.4-SNAPSHOT.jar:0.4-SNAPSHOT]
>       at org.apache.flink.table.store.shaded.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133) ~[flink-table-store-hive-catalog-0.4-SNAPSHOT.jar:0.4-SNAPSHOT]
>       at org.apache.flink.table.store.shaded.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) ~[flink-table-store-hive-catalog-0.4-SNAPSHOT.jar:0.4-SNAPSHOT]
>       at org.apache.flink.table.store.shaded.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:97) ~[flink-table-store-hive-catalog-0.4-SNAPSHOT.jar:0.4-SNAPSHOT]
>       at org.apache.flink.table.store.hive.HiveCatalog.createClient(HiveCatalog.java:415) ~[flink-table-store-hive-catalog-0.4-SNAPSHOT.jar:0.4-SNAPSHOT]
>       at org.apache.flink.table.store.hive.HiveCatalog.<init>(HiveCatalog.java:82) ~[flink-table-store-hive-catalog-0.4-SNAPSHOT.jar:0.4-SNAPSHOT]
>       at org.apache.flink.table.store.hive.HiveCatalogFactory.create(HiveCatalogFactory.java:51) ~[flink-table-store-hive-catalog-0.4-SNAPSHOT.jar:0.4-SNAPSHOT]
>       at org.apache.flink.table.store.file.catalog.CatalogFactory.createCatalog(CatalogFactory.java:106) ~[flink-table-store-dist-0.4-SNAPSHOT.jar:0.4-SNAPSHOT]
>       at org.apache.flink.table.store.connector.FlinkCatalogFactory.createCatalog(FlinkCatalogFactory.java:66) ~[flink-table-store-dist-0.4-SNAPSHOT.jar:0.4-SNAPSHOT]
>       at org.apache.flink.table.store.connector.FlinkCatalogFactory.createCatalog(FlinkCatalogFactory.java:57) ~[flink-table-store-dist-0.4-SNAPSHOT.jar:0.4-SNAPSHOT]
>       at org.apache.flink.table.store.connector.FlinkCatalogFactory.createCatalog(FlinkCatalogFactory.java:31) ~[flink-table-store-dist-0.4-SNAPSHOT.jar:0.4-SNAPSHOT]
>       at org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:435) ~[flink-table-api-java-uber-1.16.0.jar:1.16.0]
>       at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalog(TableEnvironmentImpl.java:1426) ~[flink-table-api-java-uber-1.16.0.jar:1.16.0]
>       at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1172) ~[flink-table-api-java-uber-1.16.0.jar:1.16.0]
>       at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:206) ~[flink-sql-client-1.16.0.jar:1.16.0]
>       ... 10 more
> {code}
> This is because {{hive-default.xml.template}} contains a property named 
> {{hive.metastore.filter.hook}} whose default value is 
> {{org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl}}. However, 
> all Hive packages in Table Store are shaded (relocated), so the class loader 
> cannot find the class under its original name.
> To fix this, we need to remove the relocation of Hive classes.
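> For illustration, the entry that triggers the failing lookup looks roughly 
> like this in a {{hive-site.xml}} derived from {{hive-default.xml.template}} 
> (property name and default value as described above; the surrounding XML is 
> the standard Hadoop/Hive configuration layout):
> {code}
> <!-- hive-site.xml: default metastore filter hook shipped with Hive 2.3.9 -->
> <property>
>   <name>hive.metastore.filter.hook</name>
>   <!-- Refers to the original class name, which no longer exists once Hive
>        classes are relocated into the Table Store shaded namespace -->
>   <value>org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl</value>
> </property>
> {code}
> Concretely, removing the relocation means dropping an entry of roughly this 
> shape from the maven-shade-plugin configuration of 
> {{flink-table-store-hive-catalog}} (a sketch inferred from the shaded package 
> names in the stack trace; the actual pom may differ):
> {code}
> <!-- Relocation to remove so class names from hive-site.xml resolve again -->
> <relocation>
>   <pattern>org.apache.hadoop.hive</pattern>
>   <shadedPattern>org.apache.flink.table.store.shaded.org.apache.hadoop.hive</shadedPattern>
> </relocation>
> {code}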



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
