[ 
https://issues.apache.org/jira/browse/HIVE-22052?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Janus Chow updated HIVE-22052:
------------------------------
    Environment: 
Hive : 1.2.2 (4 metastores)
mysql : 5.7.21-20
jdbc : mysql-connector-java-8.0.17.jar

 

  was:
Hive : 1.2.2 (4 metastores)
 mysql :
{code:java}
+-------------------------+---------------------------------------------------------+
| Variable_name           | Value                                                   |
+-------------------------+---------------------------------------------------------+
| innodb_version          | 5.7.21-20                                               |
| protocol_version        | 10                                                      |
| slave_type_conversions  |                                                         |
| tls_version             | TLSv1,TLSv1.1,TLSv1.2                                   |
| version                 | 5.7.21-20-log                                           |
| version_comment         | Percona Server (GPL), Release 20, Revision ed217b06ca3  |
| version_compile_machine | x86_64                                                  |
| version_compile_os      | Linux                                                   |
| version_suffix          | -log                                                    |
+-------------------------+---------------------------------------------------------+
{code}
 jdbc : mysql-connector-java-8.0.17.jar

 


> Can not drop table with Hive Metastore
> --------------------------------------
>
>                 Key: HIVE-22052
>                 URL: https://issues.apache.org/jira/browse/HIVE-22052
>             Project: Hive
>          Issue Type: Bug
>          Components: Metastore
>    Affects Versions: 1.2.2
>         Environment: Hive : 1.2.2 (4 metastores)
> mysql : 5.7.21-20
> jdbc : mysql-connector-java-8.0.17.jar
>  
>            Reporter: Janus Chow
>            Priority: Major
>              Labels: metastore
>         Attachments: hive_metastore_drop_table_error
>
>
> When trying to drop a table, I encountered the following error:
>  
> {code:java}
> 2019-07-25 23:56:35,601 ERROR [pool-4-thread-199]: 
> metastore.RetryingHMSHandler (RetryingHMSHandler.java:invoke(173)) - Retrying 
> HMSHandler after 2000 ms (attempt 1 of 10) with error: 
> javax.jdo.JDOUserException: One or more instances could not be deleted
>       at 
> org.datanucleus.api.jdo.JDOPersistenceManager.deletePersistentAll(JDOPersistenceManager.java:851)
>       at 
> org.datanucleus.api.jdo.JDOPersistenceManager.deletePersistentAll(JDOPersistenceManager.java:830)
>       at 
> org.apache.hadoop.hive.metastore.ObjectStore.dropTable(ObjectStore.java:911)
>       at sun.reflect.GeneratedMethodAccessor63.invoke(Unknown Source)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at 
> org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114)
>       at com.sun.proxy.$Proxy6.dropTable(Unknown Source)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_core(HiveMetaStore.java:1535)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_with_environment_context(HiveMetaStore.java:1737)
>       at sun.reflect.GeneratedMethodAccessor64.invoke(Unknown Source)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at 
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
>       at com.sun.proxy.$Proxy8.drop_table_with_environment_context(Unknown 
> Source)
>       at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_table_with_environment_context.getResult(ThriftHiveMetastore.java:9256)
>       at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_table_with_environment_context.getResult(ThriftHiveMetastore.java:9240)
>       at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>       at 
> org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
>       at 
> org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:422)
>       at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1893)
>       at 
> org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
>       at 
> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>       at java.lang.Thread.run(Thread.java:748)
> NestedThrowablesStackTrace:
> Clear request failed : DELETE FROM `PARTITION_KEYS` WHERE `TBL_ID`=?
> org.datanucleus.exceptions.NucleusDataStoreException: Clear request failed : 
> DELETE FROM `PARTITION_KEYS` WHERE `TBL_ID`=?
>       at 
> org.datanucleus.store.rdbms.scostore.ElementContainerStore.executeClear(ElementContainerStore.java:566)
>       at 
> org.datanucleus.store.rdbms.scostore.ElementContainerStore.clear(ElementContainerStore.java:401)
>       at org.datanucleus.store.types.backed.List.clear(List.java:861)
>       at 
> org.datanucleus.store.rdbms.mapping.java.CollectionMapping.preDelete(CollectionMapping.java:295)
>       at 
> org.datanucleus.store.rdbms.request.DeleteRequest.execute(DeleteRequest.java:192)
>       at 
> org.datanucleus.store.rdbms.RDBMSPersistenceHandler.deleteTable(RDBMSPersistenceHandler.java:508)
>       at 
> org.datanucleus.store.rdbms.RDBMSPersistenceHandler.deleteObject(RDBMSPersistenceHandler.java:479)
>       at 
> org.datanucleus.state.AbstractStateManager.internalDeletePersistent(AbstractStateManager.java:822)
>       at 
> org.datanucleus.state.JDOStateManager.deletePersistent(JDOStateManager.java:4685)
>       at 
> org.datanucleus.ExecutionContextImpl.deleteObjectInternal(ExecutionContextImpl.java:2544)
>       at 
> org.datanucleus.ExecutionContextImpl.deleteObjectWork(ExecutionContextImpl.java:2466)
>       at 
> org.datanucleus.ExecutionContextImpl.deleteObjects(ExecutionContextImpl.java:2353)
>       at 
> org.datanucleus.ExecutionContextThreadedImpl.deleteObjects(ExecutionContextThreadedImpl.java:259)
>       at 
> org.datanucleus.api.jdo.JDOPersistenceManager.deletePersistentAll(JDOPersistenceManager.java:846)
>       at 
> org.datanucleus.api.jdo.JDOPersistenceManager.deletePersistentAll(JDOPersistenceManager.java:830)
>       at 
> org.apache.hadoop.hive.metastore.ObjectStore.dropTable(ObjectStore.java:911)
>       at sun.reflect.GeneratedMethodAccessor63.invoke(Unknown Source)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at 
> org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114)
>       at com.sun.proxy.$Proxy6.dropTable(Unknown Source)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_core(HiveMetaStore.java:1535)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_with_environment_context(HiveMetaStore.java:1737)
>       at sun.reflect.GeneratedMethodAccessor64.invoke(Unknown Source)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at 
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
>       at com.sun.proxy.$Proxy8.drop_table_with_environment_context(Unknown 
> Source)
>       at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_table_with_environment_context.getResult(ThriftHiveMetastore.java:9256)
>       at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_table_with_environment_context.getResult(ThriftHiveMetastore.java:9240)
>       at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>       at 
> org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
>       at 
> org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:422)
>       at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1893)
>       at 
> org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
>       at 
> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>       at java.lang.Thread.run(Thread.java:748)
> Caused by: java.sql.BatchUpdateException: Cannot delete or update a parent 
> row: a foreign key constraint fails ("shopee_data_hive_db"."COLUMNS_V2", 
> CONSTRAINT "COLUMNS_V2_FK1" FOREIGN KEY ("CD_ID") REFERENCES "CDS" ("CD_ID"))
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>       at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>       at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>       at com.mysql.cj.util.Util.handleNewInstance(Util.java:192)
>       at com.mysql.cj.util.Util.getInstance(Util.java:167)
>       at com.mysql.cj.util.Util.getInstance(Util.java:174)
>       at 
> com.mysql.cj.jdbc.exceptions.SQLError.createBatchUpdateException(SQLError.java:224)
>       at 
> com.mysql.cj.jdbc.ClientPreparedStatement.executeBatchSerially(ClientPreparedStatement.java:853)
>       at 
> com.mysql.cj.jdbc.ClientPreparedStatement.executeBatchInternal(ClientPreparedStatement.java:435)
>       at com.mysql.cj.jdbc.StatementImpl.executeBatch(StatementImpl.java:796)
>       at 
> com.jolbox.bonecp.StatementHandle.executeBatch(StatementHandle.java:424)
>       at 
> org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeBatch(ParamLoggingPreparedStatement.java:372)
>       at 
> org.datanucleus.store.rdbms.SQLController.processConnectionStatement(SQLController.java:628)
>       at 
> org.datanucleus.store.rdbms.SQLController.getStatementForUpdate(SQLController.java:207)
>       at 
> org.datanucleus.store.rdbms.SQLController.getStatementForUpdate(SQLController.java:179)
>       at 
> org.datanucleus.store.rdbms.scostore.ElementContainerStore.executeClear(ElementContainerStore.java:542)
>       ... 40 more
> Caused by: java.sql.SQLIntegrityConstraintViolationException: Cannot delete 
> or update a parent row: a foreign key constraint fails 
> ("shopee_data_hive_db"."COLUMNS_V2", CONSTRAINT "COLUMNS_V2_FK1" FOREIGN KEY 
> ("CD_ID") REFERENCES "CDS" ("CD_ID"))
>       at 
> com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:117)
>       at 
> com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
>       at 
> com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:122)
>       at 
> com.mysql.cj.jdbc.ClientPreparedStatement.executeInternal(ClientPreparedStatement.java:953)
>       at 
> com.mysql.cj.jdbc.ClientPreparedStatement.executeUpdateInternal(ClientPreparedStatement.java:1092)
>       at 
> com.mysql.cj.jdbc.ClientPreparedStatement.executeBatchSerially(ClientPreparedStatement.java:832)
>       ... 48 more
> Nested Throwables StackTrace:
> java.sql.BatchUpdateException: Cannot delete or update a parent row: a 
> foreign key constraint fails ("shopee_data_hive_db"."COLUMNS_V2", CONSTRAINT 
> "COLUMNS_V2_FK1" FOREIGN KEY ("CD_ID") REFERENCES "CDS" ("CD_ID"))
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>       at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>       at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>       at com.mysql.cj.util.Util.handleNewInstance(Util.java:192)
>       at com.mysql.cj.util.Util.getInstance(Util.java:167)
>       at com.mysql.cj.util.Util.getInstance(Util.java:174)
>       at 
> com.mysql.cj.jdbc.exceptions.SQLError.createBatchUpdateException(SQLError.java:224)
>       at 
> com.mysql.cj.jdbc.ClientPreparedStatement.executeBatchSerially(ClientPreparedStatement.java:853)
>       at 
> com.mysql.cj.jdbc.ClientPreparedStatement.executeBatchInternal(ClientPreparedStatement.java:435)
>       at com.mysql.cj.jdbc.StatementImpl.executeBatch(StatementImpl.java:796)
>       at 
> com.jolbox.bonecp.StatementHandle.executeBatch(StatementHandle.java:424)
>       at 
> org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeBatch(ParamLoggingPreparedStatement.java:372)
>       at 
> org.datanucleus.store.rdbms.SQLController.processConnectionStatement(SQLController.java:628)
>       at 
> org.datanucleus.store.rdbms.SQLController.getStatementForUpdate(SQLController.java:207)
>       at 
> org.datanucleus.store.rdbms.SQLController.getStatementForUpdate(SQLController.java:179)
>       at 
> org.datanucleus.store.rdbms.scostore.ElementContainerStore.executeClear(ElementContainerStore.java:542)
>       at 
> org.datanucleus.store.rdbms.scostore.ElementContainerStore.clear(ElementContainerStore.java:401)
>       at org.datanucleus.store.types.backed.List.clear(List.java:861)
>       at 
> org.datanucleus.store.rdbms.mapping.java.CollectionMapping.preDelete(CollectionMapping.java:295)
>       at 
> org.datanucleus.store.rdbms.request.DeleteRequest.execute(DeleteRequest.java:192)
>       at 
> org.datanucleus.store.rdbms.RDBMSPersistenceHandler.deleteTable(RDBMSPersistenceHandler.java:508)
>       at 
> org.datanucleus.store.rdbms.RDBMSPersistenceHandler.deleteObject(RDBMSPersistenceHandler.java:479)
>       at 
> org.datanucleus.state.AbstractStateManager.internalDeletePersistent(AbstractStateManager.java:822)
>       at 
> org.datanucleus.state.JDOStateManager.deletePersistent(JDOStateManager.java:4685)
>       at 
> org.datanucleus.ExecutionContextImpl.deleteObjectInternal(ExecutionContextImpl.java:2544)
>       at 
> org.datanucleus.ExecutionContextImpl.deleteObjectWork(ExecutionContextImpl.java:2466)
>       at 
> org.datanucleus.ExecutionContextImpl.deleteObjects(ExecutionContextImpl.java:2353)
>       at 
> org.datanucleus.ExecutionContextThreadedImpl.deleteObjects(ExecutionContextThreadedImpl.java:259)
>       at 
> org.datanucleus.api.jdo.JDOPersistenceManager.deletePersistentAll(JDOPersistenceManager.java:846)
>       at 
> org.datanucleus.api.jdo.JDOPersistenceManager.deletePersistentAll(JDOPersistenceManager.java:830)
>       at 
> org.apache.hadoop.hive.metastore.ObjectStore.dropTable(ObjectStore.java:911)
>       at sun.reflect.GeneratedMethodAccessor63.invoke(Unknown Source)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at 
> org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114)
>       at com.sun.proxy.$Proxy6.dropTable(Unknown Source)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_core(HiveMetaStore.java:1535)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_with_environment_context(HiveMetaStore.java:1737)
>       at sun.reflect.GeneratedMethodAccessor64.invoke(Unknown Source)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at 
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
>       at com.sun.proxy.$Proxy8.drop_table_with_environment_context(Unknown 
> Source)
>       at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_table_with_environment_context.getResult(ThriftHiveMetastore.java:9256)
>       at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_table_with_environment_context.getResult(ThriftHiveMetastore.java:9240)
>       at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>       at 
> org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
>       at 
> org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:422)
>       at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1893)
>       at 
> org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
>       at 
> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>       at java.lang.Thread.run(Thread.java:748)
> Caused by: java.sql.SQLIntegrityConstraintViolationException: Cannot delete 
> or update a parent row: a foreign key constraint fails 
> ("shopee_data_hive_db"."COLUMNS_V2", CONSTRAINT "COLUMNS_V2_FK1" FOREIGN KEY 
> ("CD_ID") REFERENCES "CDS" ("CD_ID"))
>       at 
> com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:117)
>       at 
> com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
>       at 
> com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:122)
>       at 
> com.mysql.cj.jdbc.ClientPreparedStatement.executeInternal(ClientPreparedStatement.java:953)
>       at 
> com.mysql.cj.jdbc.ClientPreparedStatement.executeUpdateInternal(ClientPreparedStatement.java:1092)
>       at 
> com.mysql.cj.jdbc.ClientPreparedStatement.executeBatchSerially(ClientPreparedStatement.java:832)
>       ... 48 more
> {code}
>  The table I'm trying to drop is just a temporary table.
>  
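The root cause visible in the trace is a referential-integrity failure: the metastore deletes a parent row while child rows still reference it through a foreign key. Below is a minimal sketch of that failure mode. It uses SQLite in place of MySQL, and only the table/column names `COLUMNS_V2`, `CDS`, and `CD_ID` are taken from the log; everything else is illustrative, not the actual Hive metastore schema.

```python
# Minimal sketch (NOT the real metastore schema): deleting a parent row
# ("CDS") while a child row ("COLUMNS_V2") still references it via a
# foreign key fails, mirroring the constraint violation in the log above.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
conn.execute("CREATE TABLE CDS (CD_ID INTEGER PRIMARY KEY)")
conn.execute("""CREATE TABLE COLUMNS_V2 (
    CD_ID INTEGER,
    COLUMN_NAME TEXT,
    FOREIGN KEY (CD_ID) REFERENCES CDS (CD_ID))""")
conn.execute("INSERT INTO CDS VALUES (1)")
conn.execute("INSERT INTO COLUMNS_V2 VALUES (1, 'id')")

fired = False
try:
    # Fails: a COLUMNS_V2 row still points at CDS.CD_ID = 1
    conn.execute("DELETE FROM CDS WHERE CD_ID = 1")
except sqlite3.IntegrityError as e:
    fired = True
    print("constraint fired:", e)

# Deleting the child rows first lets the parent delete succeed --
# this ordering is what the drop-table path has to guarantee.
conn.execute("DELETE FROM COLUMNS_V2 WHERE CD_ID = 1")
conn.execute("DELETE FROM CDS WHERE CD_ID = 1")
```

The same ordering requirement applies to the `DELETE FROM PARTITION_KEYS WHERE TBL_ID=?` statement shown in the trace: if the constraint fires, some dependent row was not cleared before its parent.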



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)
