openinx opened a new issue #2575:
URL: https://github.com/apache/iceberg/issues/2575


   I've encountered this flaky unit test several times: 
   
   ```
   org.apache.iceberg.flink.TestFlinkTableSink > testHashDistributeMode[catalogName=testhive, baseNamespace=, format=AVRO, isStreaming=true] FAILED
       java.lang.AssertionError: There should be only 1 data file in partition 'aaa' expected:<1> but was:<2>
           at org.junit.Assert.fail(Assert.java:88)
           at org.junit.Assert.failNotEquals(Assert.java:834)
           at org.junit.Assert.assertEquals(Assert.java:645)
           at org.apache.iceberg.flink.TestFlinkTableSink.testHashDistributeMode(TestFlinkTableSink.java:274)
   
       org.apache.flink.table.api.ValidationException: Could not execute DROP DATABASE IF EXISTS  testhive.db RESTRICT
           at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:989)
           at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:666)
           at org.apache.iceberg.flink.FlinkTestBase.exec(FlinkTestBase.java:91)
           at org.apache.iceberg.flink.FlinkTestBase.exec(FlinkTestBase.java:95)
           at org.apache.iceberg.flink.FlinkTestBase.sql(FlinkTestBase.java:99)
           at org.apache.iceberg.flink.TestFlinkTableSink.clean(TestFlinkTableSink.java:126)
   
           Caused by:
           org.apache.flink.table.catalog.exceptions.DatabaseNotEmptyException: Database db in catalog testhive is not empty.
               at org.apache.iceberg.flink.FlinkCatalog.dropDatabase(FlinkCatalog.java:240)
               at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:983)
               ... 5 more
   
               Caused by:
               org.apache.iceberg.exceptions.NamespaceNotEmptyException: Namespace db is not empty. One or more tables exist.
                   at org.apache.iceberg.hive.HiveCatalog.dropNamespace(HiveCatalog.java:307)
                   at org.apache.iceberg.flink.FlinkCatalog.dropDatabase(FlinkCatalog.java:231)
                   ... 6 more
   
                   Caused by:
                   InvalidOperationException(message:Database db is not empty. One or more tables exist.)
                       at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$drop_database_result$drop_database_resultStandardScheme.read(ThriftHiveMetastore.java:28714)
                       at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$drop_database_result$drop_database_resultStandardScheme.read(ThriftHiveMetastore.java:28691)
                       at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$drop_database_result.read(ThriftHiveMetastore.java:28625)
                       at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:86)
                       at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_drop_database(ThriftHiveMetastore.java:813)
                       at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.drop_database(ThriftHiveMetastore.java:798)
                       at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropDatabase(HiveMetaStoreClient.java:868)
                       at org.apache.iceberg.hive.HiveCatalog.lambda$dropNamespace$9(HiveCatalog.java:296)
                       at org.apache.iceberg.ClientPoolImpl.run(ClientPoolImpl.java:51)
                       at org.apache.iceberg.hive.CachedClientPool.run(CachedClientPool.java:77)
                       at org.apache.iceberg.hive.HiveCatalog.dropNamespace(HiveCatalog.java:295)
                       ... 7 more
   ```
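
   For context, the failing assertion relies on hash distribution routing every row of a given partition to a single writer subtask, so each partition should produce exactly one data file per checkpoint. Here is a minimal, stdlib-only sketch of that routing idea (the writer count and the use of `String.hashCode` are illustrative assumptions, not Flink's actual key-group hashing):

   ```java
   import java.util.List;
   import java.util.Map;
   import java.util.function.Function;
   import java.util.stream.Collectors;

   public class HashDistributeSketch {
       // Route a row to a writer subtask by hashing its partition key,
       // mirroring the intent of write.distribution-mode=hash.
       static int writerFor(String partitionKey, int numWriters) {
           return Math.floorMod(partitionKey.hashCode(), numWriters);
       }

       public static void main(String[] args) {
           List<String> rows = List.of("aaa", "aaa", "bbb", "aaa", "ccc");
           int numWriters = 4;

           // Group rows by the writer they would be routed to. All rows of
           // partition 'aaa' land on the same writer, so that writer should
           // emit a single data file for 'aaa' per checkpoint.
           Map<Integer, List<String>> byWriter = rows.stream()
                   .collect(Collectors.groupingBy(k -> writerFor(k, numWriters)));
           System.out.println(byWriter);
       }
   }
   ```

   If the shuffle is not keyed correctly (or rows of one partition reach multiple writers), more than one file per partition appears, which matches the `expected:<1> but was:<2>` failure above.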


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


