imback82 commented on a change in pull request #35173:
URL: https://github.com/apache/spark/pull/35173#discussion_r782598898
##########
File path:
sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala
##########
@@ -220,15 +220,14 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, hadoopConf: Configurat
override def dropDatabase(
db: String,
ignoreIfNotExists: Boolean,
-      cascade: Boolean): Unit = withClient {
-    try {
-      client.dropDatabase(db, ignoreIfNotExists, cascade)
-    } catch {
-      case NonFatal(exception) =>
-        if (exception.getClass.getName.equals("org.apache.hadoop.hive.ql.metadata.HiveException")
-            && exception.getMessage.contains(s"Database $db is not empty.")) {
-          throw QueryCompilationErrors.cannotDropNonemptyDatabaseError(db)
-        } else throw exception
+      cascade: Boolean): Unit = withClientWrappingException {
+    client.dropDatabase(db, ignoreIfNotExists, cascade)
+  } { exception =>
+    if (exception.getClass.getName.equals("org.apache.hadoop.hive.ql.metadata.HiveException")
+        && exception.getMessage.contains(s"Database $db is not empty")) {
Review comment:
@cloud-fan Note that I removed the trailing `.` when checking the message.
For Hive 0.12, the exception message is
`InvalidOperationException(message:Database temporary is not empty)`, whereas
for Hive >0.12 it is `InvalidOperationException(message:Database
temporary is not empty. One or more tables exist.)`. With the trailing `.`, the
substring check never matched on Hive 0.12, so the exception wasn't being
wrapped there.
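The version-tolerant message check can be sketched as a standalone snippet. This is illustrative only: `isNonEmptyDatabaseError` is a hypothetical helper name (the real code inlines the condition inside `dropDatabase`), and the two sample messages are the Hive 0.12 and Hive >0.12 forms quoted above.

```scala
// Sketch of the version-tolerant check. Dropping the trailing "." from the
// matched substring lets it cover both Hive message variants.
object DropDatabaseCheck {
  // Hypothetical helper; Spark inlines this condition in HiveExternalCatalog.
  def isNonEmptyDatabaseError(className: String, message: String, db: String): Boolean =
    className == "org.apache.hadoop.hive.ql.metadata.HiveException" &&
      message.contains(s"Database $db is not empty")
}

object DropDatabaseCheckDemo extends App {
  val hiveException = "org.apache.hadoop.hive.ql.metadata.HiveException"
  // Hive 0.12 form: no trailing period after "not empty".
  val msg012 = "InvalidOperationException(message:Database temporary is not empty)"
  // Hive >0.12 form: trailing period plus extra detail.
  val msgNewer =
    "InvalidOperationException(message:Database temporary is not empty. One or more tables exist.)"

  println(DropDatabaseCheck.isNonEmptyDatabaseError(hiveException, msg012, "temporary"))
  println(DropDatabaseCheck.isNonEmptyDatabaseError(hiveException, msgNewer, "temporary"))
}
```

With the old substring `"Database $db is not empty."`, only the second message would match; without the period, both do.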
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]