Reo-LEI opened a new issue #3363:
URL: https://github.com/apache/iceberg/issues/3363


   Since https://github.com/apache/iceberg/pull/3099, I encounter a 
`NoSuchMethodException` when I submit a Flink job to a Flink cluster whose Hive 
Metastore version is 2.1.1. 
   
   In #3099, `HiveClientPool` dynamically calls the 
`RetryingMetaStoreClient.getProxy(HiveConf.class, Boolean.TYPE)` method, and 
`getProxy` in turn invokes the `HiveMetaStoreClient(HiveConf, Boolean)` 
[constructor](https://github.com/apache/hive/blob/1af77bbf8356e86cabbed92cfa8cc2e1470a1d5c/metastore/src/java/org/apache/hadoop/hive/metastore/RetryingMetaStoreClient.java#L86).
 However, `HiveMetaStoreClient` in Hive 2.1.1 does not have this 
[constructor](https://github.com/apache/hive/blob/1af77bbf8356e86cabbed92cfa8cc2e1470a1d5c/metastore/src/java/org/apache/hadoop/hive/metastore/HiveMetaStoreClient.java#L209),
 so the `NoSuchMethodException` is raised.
   
   I think we should find a way to stay compatible with older Hive versions. 
@szehon-ho @rdblue @pvary @jackye1995 @marton-bod 
   
   ```
   org.apache.flink.client.program.ProgramInvocationException: The main method 
caused an error: Unable to create a sink for writing table 
'default_catalog.default_database.table_name'.
   
   Table options are:
   ...
   
        at 
org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:366)
 ~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:219)
 ~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:814) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:246) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1056) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1134) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at java.security.AccessController.doPrivileged(Native Method) 
~[?:1.8.0_112]
        at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_112]
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
 [flink-shaded-hadoop-2-uber-2.7.5-7.0.jar:2.7.5-7.0]
        at 
org.apache.flink.runtime.security.contexts.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
 [flink-dist_2.12-1.12.1.jar:1.12.1]
        at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1134) 
[flink-dist_2.12-1.12.1.jar:1.12.1]
   Caused by: org.apache.flink.table.api.ValidationException: Unable to create 
a sink for writing table 
'default_catalog.default_database.ods_mysql_hive_exec_job_rewrite'.
   
   Table options are:
   
   ...
   
        at 
org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:156)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.planner.delegation.PlannerBase.getTableSink(PlannerBase.scala:369)
 ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:221)
 ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.planner.delegation.PlannerBase.$anonfun$translate$1(PlannerBase.scala:159)
 ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
        at 
scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:233) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.Iterator.foreach(Iterator.scala:937) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.Iterator.foreach$(Iterator.scala:937) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1425) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.IterableLike.foreach(IterableLike.scala:70) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.IterableLike.foreach$(IterableLike.scala:69) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.AbstractIterable.foreach(Iterable.scala:54) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.TraversableLike.map(TraversableLike.scala:233) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.TraversableLike.map$(TraversableLike.scala:226) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.AbstractTraversable.map(Traversable.scala:104) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:159)
 ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1329)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:676)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:767)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:666)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
com.huya.dc.walrus.lakehouse.flink.sql.FlinkSQLSubmitter.executeSQL(FlinkSQLSubmitter.java:156)
 ~[?:?]
        at 
com.huya.dc.walrus.lakehouse.flink.sql.FlinkSQLSubmitter.main(FlinkSQLSubmitter.java:112)
 ~[?:?]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
~[?:1.8.0_112]
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[?:1.8.0_112]
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[?:1.8.0_112]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
        at 
org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:349)
 ~[flink-dist_2.12-1.12.1.jar:1.12.1]
        ... 11 more
   Caused by: org.apache.iceberg.hive.RuntimeMetaException: Failed to connect 
to Hive Metastore
        at 
org.apache.iceberg.hive.HiveClientPool.newClient(HiveClientPool.java:73) ~[?:?]
        at 
org.apache.iceberg.hive.HiveClientPool.newClient(HiveClientPool.java:33) ~[?:?]
        at org.apache.iceberg.ClientPoolImpl.get(ClientPoolImpl.java:125) ~[?:?]
        at org.apache.iceberg.ClientPoolImpl.run(ClientPoolImpl.java:56) ~[?:?]
        at org.apache.iceberg.ClientPoolImpl.run(ClientPoolImpl.java:51) ~[?:?]
        at 
org.apache.iceberg.hive.CachedClientPool.run(CachedClientPool.java:76) ~[?:?]
        at 
org.apache.iceberg.hive.HiveCatalog.loadNamespaceMetadata(HiveCatalog.java:386) 
~[?:?]
        at 
org.apache.iceberg.flink.FlinkCatalog.getDatabase(FlinkCatalog.java:173) ~[?:?]
        at 
org.apache.iceberg.flink.FlinkCatalog.databaseExists(FlinkCatalog.java:185) 
~[?:?]
        at 
org.apache.iceberg.flink.FlinkDynamicTableFactory.createTableLoader(FlinkDynamicTableFactory.java:163)
 ~[?:?]
        at 
org.apache.iceberg.flink.FlinkDynamicTableFactory.createDynamicTableSink(FlinkDynamicTableFactory.java:112)
 ~[?:?]
        at 
org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:153)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.planner.delegation.PlannerBase.getTableSink(PlannerBase.scala:369)
 ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:221)
 ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.planner.delegation.PlannerBase.$anonfun$translate$1(PlannerBase.scala:159)
 ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
        at 
scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:233) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.Iterator.foreach(Iterator.scala:937) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.Iterator.foreach$(Iterator.scala:937) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1425) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.IterableLike.foreach(IterableLike.scala:70) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.IterableLike.foreach$(IterableLike.scala:69) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.AbstractIterable.foreach(Iterable.scala:54) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.TraversableLike.map(TraversableLike.scala:233) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.TraversableLike.map$(TraversableLike.scala:226) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.AbstractTraversable.map(Traversable.scala:104) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:159)
 ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1329)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:676)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:767)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:666)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
com.huya.dc.walrus.lakehouse.flink.sql.FlinkSQLSubmitter.executeSQL(FlinkSQLSubmitter.java:156)
 ~[?:?]
        at 
com.huya.dc.walrus.lakehouse.flink.sql.FlinkSQLSubmitter.main(FlinkSQLSubmitter.java:112)
 ~[?:?]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
~[?:1.8.0_112]
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[?:1.8.0_112]
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[?:1.8.0_112]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
        at 
org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:349)
 ~[flink-dist_2.12-1.12.1.jar:1.12.1]
        ... 11 more
   Caused by: java.lang.RuntimeException: Unable to instantiate 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient
        at 
org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1654)
 ~[?:?]
        at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83)
 ~[?:?]
        at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133)
 ~[?:?]
        at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:89)
 ~[?:?]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
~[?:1.8.0_112]
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[?:1.8.0_112]
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[?:1.8.0_112]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
        at 
org.apache.iceberg.common.DynMethods$UnboundMethod.invokeChecked(DynMethods.java:65)
 ~[?:?]
        at 
org.apache.iceberg.common.DynMethods$UnboundMethod.invoke(DynMethods.java:77) 
~[?:?]
        at 
org.apache.iceberg.common.DynMethods$StaticMethod.invoke(DynMethods.java:196) 
~[?:?]
        at 
org.apache.iceberg.hive.HiveClientPool.newClient(HiveClientPool.java:56) ~[?:?]
        at 
org.apache.iceberg.hive.HiveClientPool.newClient(HiveClientPool.java:33) ~[?:?]
        at org.apache.iceberg.ClientPoolImpl.get(ClientPoolImpl.java:125) ~[?:?]
        at org.apache.iceberg.ClientPoolImpl.run(ClientPoolImpl.java:56) ~[?:?]
        at org.apache.iceberg.ClientPoolImpl.run(ClientPoolImpl.java:51) ~[?:?]
        at 
org.apache.iceberg.hive.CachedClientPool.run(CachedClientPool.java:76) ~[?:?]
        at 
org.apache.iceberg.hive.HiveCatalog.loadNamespaceMetadata(HiveCatalog.java:386) 
~[?:?]
        at 
org.apache.iceberg.flink.FlinkCatalog.getDatabase(FlinkCatalog.java:173) ~[?:?]
        at 
org.apache.iceberg.flink.FlinkCatalog.databaseExists(FlinkCatalog.java:185) 
~[?:?]
        at 
org.apache.iceberg.flink.FlinkDynamicTableFactory.createTableLoader(FlinkDynamicTableFactory.java:163)
 ~[?:?]
        at 
org.apache.iceberg.flink.FlinkDynamicTableFactory.createDynamicTableSink(FlinkDynamicTableFactory.java:112)
 ~[?:?]
        at 
org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:153)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.planner.delegation.PlannerBase.getTableSink(PlannerBase.scala:369)
 ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:221)
 ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.planner.delegation.PlannerBase.$anonfun$translate$1(PlannerBase.scala:159)
 ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
        at 
scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:233) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.Iterator.foreach(Iterator.scala:937) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.Iterator.foreach$(Iterator.scala:937) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1425) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.IterableLike.foreach(IterableLike.scala:70) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.IterableLike.foreach$(IterableLike.scala:69) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.AbstractIterable.foreach(Iterable.scala:54) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.TraversableLike.map(TraversableLike.scala:233) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.TraversableLike.map$(TraversableLike.scala:226) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.AbstractTraversable.map(Traversable.scala:104) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:159)
 ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1329)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:676)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:767)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:666)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
com.huya.dc.walrus.lakehouse.flink.sql.FlinkSQLSubmitter.executeSQL(FlinkSQLSubmitter.java:156)
 ~[?:?]
        at 
com.huya.dc.walrus.lakehouse.flink.sql.FlinkSQLSubmitter.main(FlinkSQLSubmitter.java:112)
 ~[?:?]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
~[?:1.8.0_112]
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[?:1.8.0_112]
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[?:1.8.0_112]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
        at 
org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:349)
 ~[flink-dist_2.12-1.12.1.jar:1.12.1]
        ... 11 more
   Caused by: java.lang.NoSuchMethodException: 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(org.apache.hadoop.hive.conf.HiveConf,
 java.lang.Boolean)
        at java.lang.Class.getConstructor0(Class.java:3082) ~[?:1.8.0_112]
        at java.lang.Class.getDeclaredConstructor(Class.java:2178) 
~[?:1.8.0_112]
        at 
org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1650)
 ~[?:?]
        at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83)
 ~[?:?]
        at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133)
 ~[?:?]
        at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:89)
 ~[?:?]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
~[?:1.8.0_112]
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[?:1.8.0_112]
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[?:1.8.0_112]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
        at 
org.apache.iceberg.common.DynMethods$UnboundMethod.invokeChecked(DynMethods.java:65)
 ~[?:?]
        at 
org.apache.iceberg.common.DynMethods$UnboundMethod.invoke(DynMethods.java:77) 
~[?:?]
        at 
org.apache.iceberg.common.DynMethods$StaticMethod.invoke(DynMethods.java:196) 
~[?:?]
        at 
org.apache.iceberg.hive.HiveClientPool.newClient(HiveClientPool.java:56) ~[?:?]
        at 
org.apache.iceberg.hive.HiveClientPool.newClient(HiveClientPool.java:33) ~[?:?]
        at org.apache.iceberg.ClientPoolImpl.get(ClientPoolImpl.java:125) ~[?:?]
        at org.apache.iceberg.ClientPoolImpl.run(ClientPoolImpl.java:56) ~[?:?]
        at org.apache.iceberg.ClientPoolImpl.run(ClientPoolImpl.java:51) ~[?:?]
        at 
org.apache.iceberg.hive.CachedClientPool.run(CachedClientPool.java:76) ~[?:?]
        at 
org.apache.iceberg.hive.HiveCatalog.loadNamespaceMetadata(HiveCatalog.java:386) 
~[?:?]
        at 
org.apache.iceberg.flink.FlinkCatalog.getDatabase(FlinkCatalog.java:173) ~[?:?]
        at 
org.apache.iceberg.flink.FlinkCatalog.databaseExists(FlinkCatalog.java:185) 
~[?:?]
        at 
org.apache.iceberg.flink.FlinkDynamicTableFactory.createTableLoader(FlinkDynamicTableFactory.java:163)
 ~[?:?]
        at 
org.apache.iceberg.flink.FlinkDynamicTableFactory.createDynamicTableSink(FlinkDynamicTableFactory.java:112)
 ~[?:?]
        at 
org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:153)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.planner.delegation.PlannerBase.getTableSink(PlannerBase.scala:369)
 ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:221)
 ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.planner.delegation.PlannerBase.$anonfun$translate$1(PlannerBase.scala:159)
 ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
        at 
scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:233) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.Iterator.foreach(Iterator.scala:937) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.Iterator.foreach$(Iterator.scala:937) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1425) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.IterableLike.foreach(IterableLike.scala:70) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.IterableLike.foreach$(IterableLike.scala:69) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.AbstractIterable.foreach(Iterable.scala:54) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.TraversableLike.map(TraversableLike.scala:233) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.TraversableLike.map$(TraversableLike.scala:226) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at scala.collection.AbstractTraversable.map(Traversable.scala:104) 
~[flink-dist_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:159)
 ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1329)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:676)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:767)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:666)
 ~[flink-table_2.12-1.12.1.jar:1.12.1]
        at 
com.huya.dc.walrus.lakehouse.flink.sql.FlinkSQLSubmitter.executeSQL(FlinkSQLSubmitter.java:156)
 ~[?:?]
        at 
com.huya.dc.walrus.lakehouse.flink.sql.FlinkSQLSubmitter.main(FlinkSQLSubmitter.java:112)
 ~[?:?]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
~[?:1.8.0_112]
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[?:1.8.0_112]
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[?:1.8.0_112]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
        at 
org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:349)
 ~[flink-dist_2.12-1.12.1.jar:1.12.1]
        ... 11 more
   ```
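   One possible compatibility approach is to probe for the two-argument constructor reflectively and fall back to the one-argument form when it is absent. Below is a minimal, self-contained sketch of that pattern using plain `java.lang.reflect`; `Conf`, `LegacyClient`, `ModernClient`, and `newClient` are hypothetical stand-ins for `HiveConf` and the two `HiveMetaStoreClient` variants, not real Hive classes or the actual Iceberg fix.

   ```java
   import java.lang.reflect.Constructor;

   // Hypothetical stand-in for HiveConf.
   class Conf {}

   // Mimics Hive 2.1.x HiveMetaStoreClient: only a one-argument constructor.
   class LegacyClient {
       public LegacyClient(Conf conf) {}
   }

   // Mimics newer Hive versions that add a (HiveConf, Boolean) constructor.
   class ModernClient {
       public ModernClient(Conf conf, Boolean allowEmbedded) {}
   }

   public class FallbackDemo {
       // Try the two-argument constructor first; if the client class predates
       // it (NoSuchMethodException), fall back to the one-argument constructor.
       public static Object newClient(Class<?> clientClass, Conf conf) throws Exception {
           try {
               Constructor<?> twoArg = clientClass.getConstructor(Conf.class, Boolean.class);
               return twoArg.newInstance(conf, Boolean.TRUE);
           } catch (NoSuchMethodException e) {
               Constructor<?> oneArg = clientClass.getConstructor(Conf.class);
               return oneArg.newInstance(conf);
           }
       }

       public static void main(String[] args) throws Exception {
           Conf conf = new Conf();
           // Both client generations construct successfully via the same entry point.
           System.out.println(newClient(ModernClient.class, conf).getClass().getSimpleName());
           System.out.println(newClient(LegacyClient.class, conf).getClass().getSimpleName());
       }
   }
   ```

   The same probe-then-fall-back idea could presumably be expressed with Iceberg's `DynConstructors`/`DynMethods` utilities, which `HiveClientPool` already uses.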
   

