William1104 commented on a change in pull request #24221: [SPARK-27248][SQL]
`refreshTable` should recreate cache with same cache name and storage level
URL: https://github.com/apache/spark/pull/24221#discussion_r282571083
##########
File path: sql/core/src/test/scala/org/apache/spark/sql/test/SQLTestUtils.scala
##########
@@ -299,6 +299,28 @@ private[sql] trait SQLTestUtilsBase
}
}
+ /**
+ * Drops cache `cacheName` after calling `f`.
+ */
+ protected def withCache(cacheNames: String*)(f: => Unit): Unit = {
+ try f catch {
+ case cause: Throwable => throw cause
+ } finally {
+ cacheNames.foreach(uncacheTable)
Review comment:
The error occurred because my table-creation SQL was incorrect, so the `uncacheTable` call in the finally block failed. The exception thrown by `uncacheTable` then masked the true failure.
The `withGlobalTempView` function does it in a better way by explicitly
ignoring `NoSuchTableException` in the finally clause.
```
try f finally {
  // If the test failed part way, we don't want to mask the failure by failing
  // to remove global temp views that never got created.
  try viewNames.foreach(spark.catalog.dropGlobalTempView) catch {
    case _: NoSuchTableException =>
  }
}
```
I would like to enhance the `withCache` function in a similar way, but I am not sure which exception we should ignore. In Scala, do we have anything similar to Java's suppressed exceptions? In a Java try-with-resources block, an exception thrown by the `close()` method is attached to the original exception as a suppressed exception.
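For what it's worth, Scala has no language-level equivalent of try-with-resources before `scala.util.Using` (Scala 2.13), but the JVM's `Throwable.addSuppressed` (Java 7+) is callable from Scala, so the same suppression semantics can be reproduced by hand. A minimal sketch, plain Scala rather than Spark code, where `withCleanup` is a hypothetical helper standing in for `withCache`:

```scala
// Mimic try-with-resources suppression: if both the body and the cleanup
// throw, attach the cleanup failure to the body's exception via
// Throwable.addSuppressed instead of letting it mask the real failure.
def withCleanup[T](body: => T)(cleanup: => Unit): T = {
  var primary: Throwable = null
  try {
    body
  } catch {
    case t: Throwable =>
      primary = t
      throw t
  } finally {
    try cleanup catch {
      case c: Throwable =>
        if (primary != null) primary.addSuppressed(c) // keep the real failure
        else throw c // cleanup failed on its own; nothing to suppress into
    }
  }
}

// Demo: the body's exception propagates, carrying the cleanup failure
// as a suppressed exception.
val caught: Throwable = try {
  withCleanup[Unit] { throw new RuntimeException("real failure") } {
    throw new IllegalStateException("cleanup failure")
  }
  null
} catch {
  case t: Throwable => t
}
```

With this shape the test's real failure stays primary and the `uncacheTable` error is still visible via `getSuppressed`, rather than being silently swallowed or replacing the original exception.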
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]