cloud-fan commented on a change in pull request #33137:
URL: https://github.com/apache/spark/pull/33137#discussion_r660909444



##########
File path: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/ddl.scala
##########
@@ -675,7 +675,15 @@ case class RepairTableCommand(
     // This is always the case for Hive format tables, but is not true for Datasource tables created
     // before Spark 2.1 unless they are converted via `msck repair table`.
     spark.sessionState.catalog.alterTable(table.copy(tracksPartitionsInCatalog = true))
-    spark.catalog.refreshTable(tableIdentWithDB)
+    try {
+      spark.catalog.refreshTable(tableIdentWithDB)
+    } catch {
+      case NonFatal(_) =>
+        logError(s"Cannot refresh the table '$tableIdentWithDB'. A query of the table " +
+          "might return wrong result if the table was cached. To avoid such issue, you should " +
+          "uncache the table manually via the UNCACHE TABLE command after table recovering will " +
+          "complete fully.")

Review comment:
       shall we also log the exception itself?
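       For illustration, one way to do that is to bind the exception in the pattern (`NonFatal(e)` instead of `NonFatal(_)`) and pass it to the logger. The sketch below is self-contained: `logError` here is a hypothetical stand-in with the same shape as the two-argument `logError(msg, throwable)` overload in Spark's `Logging` trait, and `refreshTable` is a dummy that just throws.

```scala
import scala.util.control.NonFatal

object RefreshSketch {
  // Hypothetical stand-in for Spark's Logging.logError(msg, throwable);
  // a real logger would also print the stack trace.
  def logError(msg: String, e: Throwable): Unit =
    println(s"ERROR: $msg (${e.getClass.getSimpleName}: ${e.getMessage})")

  // Dummy refresh that always fails, to exercise the catch branch.
  def refreshTable(name: String): Unit =
    throw new IllegalStateException("catalog unavailable")

  def main(args: Array[String]): Unit = {
    val tableIdentWithDB = "db.tbl"
    try {
      refreshTable(tableIdentWithDB)
    } catch {
      // Bind the exception (e, not _) so it can be handed to the logger.
      case NonFatal(e) =>
        logError(s"Cannot refresh the table '$tableIdentWithDB'.", e)
    }
  }
}
```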




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


