[ 
https://issues.apache.org/jira/browse/HIVE-21706?focusedWorklogId=240819&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-240819
 ]

ASF GitHub Bot logged work on HIVE-21706:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 13/May/19 03:13
            Start Date: 13/May/19 03:13
    Worklog Time Spent: 10m 
      Work Description: sankarh commented on pull request #620: HIVE-21706: 
REPL Dump with concurrent drop of external table fails with 
InvalidTableException.
URL: https://github.com/apache/hive/pull/620#discussion_r283176637
 
 

 ##########
 File path: ql/src/java/org/apache/hadoop/hive/ql/exec/repl/ReplDumpTask.java
 ##########
 @@ -256,19 +256,25 @@ private Long incrementalDump(Path dumpRoot, DumpMetaData dmd, Path cmRoot, Hive

       try (Writer writer = new Writer(dumpRoot, conf)) {
         for (String tableName : Utils.matchesTbl(hiveDb, dbName, work.tableNameOrPattern)) {
-          Table table = hiveDb.getTable(dbName, tableName);
+          try {
+            Table table = hiveDb.getTable(dbName, tableName);

-          // Dump external table locations if required.
-          if (shouldDumpExternalTableLocation() &&
-                  TableType.EXTERNAL_TABLE.equals(table.getTableType())) {
-            writer.dataLocationDump(table);
-          }
+            // Dump external table locations if required.
+            if (shouldDumpExternalTableLocation() &&
+                    TableType.EXTERNAL_TABLE.equals(table.getTableType())) {
+              writer.dataLocationDump(table);
+            }

-          // Dump the table to be bootstrapped if required.
-          if (shouldBootstrapDumpTable(table)) {
-            HiveWrapper.Tuple<Table> tableTuple = new HiveWrapper(hiveDb, dbName).table(table);
-            dumpTable(dbName, tableName, validTxnList, dbRoot, bootDumpBeginReplId, hiveDb,
-                    tableTuple);
+            // Dump the table to be bootstrapped if required.
+            if (shouldBootstrapDumpTable(table)) {
+              HiveWrapper.Tuple<Table> tableTuple = new HiveWrapper(hiveDb, dbName).table(table);
+              dumpTable(dbName, tableName, validTxnList, dbRoot, bootDumpBeginReplId, hiveDb,
+                      tableTuple);
+            }
+          } catch (InvalidTableException te) {
+            // Repl dump shouldn't fail if the table is dropped/renamed while dumping it.
+            // Just log a debug message and skip it.
+            LOG.debug(te.getMessage());
 Review comment:
   In bootstrap dump it is a debug log... also, the exception message has the table name. Do you think we need to change that as well to info?
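   A minimal sketch of the skip-on-concurrent-drop pattern discussed here, with the skipped table logged at INFO as the comment suggests. `InvalidTableException`, `getTable`, and `dumpTables` are hypothetical stand-ins for the Hive classes in the patch, not the actual Hive API:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class SkipDroppedTables {
    // Hypothetical stand-in for org.apache.hadoop.hive.ql.metadata.InvalidTableException.
    static class InvalidTableException extends RuntimeException {
        InvalidTableException(String tableName) {
            super("Table not found " + tableName);
        }
    }

    // Hypothetical lookup: throws when the table was dropped concurrently.
    static String getTable(String name) {
        if ("dropped_tbl".equals(name)) {
            throw new InvalidTableException(name);
        }
        return name;
    }

    // Mirrors the patch's control flow: dump each table, but skip any table
    // that disappears mid-dump instead of failing the whole dump.
    static List<String> dumpTables(List<String> tableNames) {
        List<String> dumped = new ArrayList<>();
        for (String tableName : tableNames) {
            try {
                dumped.add(getTable(tableName));
            } catch (InvalidTableException te) {
                // Logging at INFO (rather than DEBUG) surfaces which table was
                // skipped; the exception message already carries the table name.
                System.out.println("Skipped table during dump: " + te.getMessage());
            }
        }
        return dumped;
    }

    public static void main(String[] args) {
        System.out.println(dumpTables(Arrays.asList("t1", "dropped_tbl", "t2")));
        // prints the skip message for dropped_tbl, then [t1, t2]
    }
}
```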
   
 
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


Issue Time Tracking
-------------------

    Worklog Id:     (was: 240819)
    Time Spent: 40m  (was: 0.5h)

> REPL Dump with concurrent drop of external table fails with 
> InvalidTableException.
> ----------------------------------------------------------------------------------
>
>                 Key: HIVE-21706
>                 URL: https://issues.apache.org/jira/browse/HIVE-21706
>             Project: Hive
>          Issue Type: Bug
>          Components: HiveServer2, repl
>    Affects Versions: 4.0.0
>            Reporter: Sankar Hariappan
>            Assignee: Sankar Hariappan
>            Priority: Major
>              Labels: DR, pull-request-available, replication
>         Attachments: HIVE-21706.01.patch, HIVE-21706.02.patch
>
>          Time Spent: 40m
>  Remaining Estimate: 0h
>
> During REPL DUMP of a DB having external tables, if any of the external tables is dropped concurrently, then REPL DUMP fails with the below exception.
> {code}
> 2019-05-10T06:29:52,092 ERROR [HiveServer2-Background-Pool: Thread-745399]: repl.ReplDumpTask (:()) - failed
> org.apache.hadoop.hive.ql.metadata.InvalidTableException: Table not found catalog_sales_new
>         at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1383) ~[hive-exec-3.1.0.3.1.0.31-12.jar:3.1.0.3.1.0.31-12]
>         at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1336) ~[hive-exec-3.1.0.3.1.0.31-12.jar:3.1.0.3.1.0.31-12]
>         at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1316) ~[hive-exec-3.1.0.3.1.0.31-12.jar:3.1.0.3.1.0.31-12]
>         at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1298) ~[hive-exec-3.1.0.3.1.0.31-12.jar:3.1.0.3.1.0.31-12]
>         at org.apache.hadoop.hive.ql.exec.repl.ReplDumpTask.incrementalDump(ReplDumpTask.java:259) ~[hive-exec-3.1.0.3.1.0.31-12.jar:3.1.0.3.1.0.31-12]
>         at org.apache.hadoop.hive.ql.exec.repl.ReplDumpTask.execute(ReplDumpTask.java:121) ~[hive-exec-3.1.0.3.1.0.31-12.jar:3.1.0.3.1.0.31-12]
>         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:212) ~[hive-exec-3.1.0.3.1.0.31-12.jar:3.1.0.3.1.0.31-12]
>         at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) ~[hive-exec-3.1.0.3.1.0.31-12.jar:3.1.0.3.1.0.31-12]
>         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2711) ~[hive-exec-3.1.0.3.1.0.31-12.jar:3.1.0.3.1.0.31-12]
>         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2382) ~[hive-exec-3.1.0.3.1.0.31-12.jar:3.1.0.3.1.0.31-12]
>         at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2054) ~[hive-exec-3.1.0.3.1.0.31-12.jar:3.1.0.3.1.0.31-12]
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1752) ~[hive-exec-3.1.0.3.1.0.31-12.jar:3.1.0.3.1.0.31-12]
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1746) ~[hive-exec-3.1.0.3.1.0.31-12.jar:3.1.0.3.1.0.31-12]
>         at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) ~[hive-exec-3.1.0.3.1.0.31-12.jar:3.1.0.3.1.0.31-12]
>         at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:226) ~[hive-service-3.1.0.3.1.0.31-12.jar:3.1.0.3.1.0.31-12]
>         at org.apache.hive.service.cli.operation.SQLOperation.access$700(SQLOperation.java:87) ~[hive-service-3.1.0.3.1.0.31-12.jar:3.1.0.3.1.0.31-12]
>         at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:324) ~[hive-service-3.1.0.3.1.0.31-12.jar:3.1.0.3.1.0.31-12]
>         at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_181]
>         at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_181]
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) ~[hadoop-common-3.1.1.3.1.0.31-12.jar:?]
>         at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:342) ~[hive-service-3.1.0.3.1.0.31-12.jar:3.1.0.3.1.0.31-12]
>         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_181]
>         at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_181]
>         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_181]
>         at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_181]
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_181]
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_181]
>         at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
