dilipbiswal commented on a change in pull request #23905: [SPARK-24669][SQL] Refresh table before drop database cascade
URL: https://github.com/apache/spark/pull/23905#discussion_r260636258
##########
File path:
sql/core/src/main/scala/org/apache/spark/sql/execution/command/ddl.scala
##########
```diff
@@ -102,6 +102,12 @@ case class DropDatabaseCommand(
   extends RunnableCommand {

   override def run(sparkSession: SparkSession): Seq[Row] = {
+    val catalog = sparkSession.sessionState.catalog
+    if (cascade) {
+      catalog.listTables(databaseName).foreach { t =>
+        catalog.refreshTable(t)
+      }
+    }
     sparkSession.sessionState.catalog.dropDatabase(databaseName, ifExists, cascade)
```
Review comment:
This call can fail. For example, it's not allowed to drop the default database. In that case, even though the drop database action will not go ahead, we will already have refreshed all the tables inside it. Is that expected?
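
To make the concern concrete: a minimal sketch of one way the refresh could be made conditional on the drop actually succeeding. This is only an illustration of the ordering problem, not the fix adopted in the PR; the `"default"` guard below merely mirrors one of the checks that `SessionCatalog.dropDatabase` performs internally, and a complete solution would need to cover all of its validation paths.

```scala
// Sketch only: avoid refreshing tables when the drop is guaranteed to fail.
// The default-database guard duplicates a check that really lives inside
// SessionCatalog.dropDatabase; it is shown here just to illustrate the point.
override def run(sparkSession: SparkSession): Seq[Row] = {
  val catalog = sparkSession.sessionState.catalog
  if (cascade && databaseName != "default") {
    // Refresh (and thus uncache) each table only when the drop can proceed.
    catalog.listTables(databaseName).foreach(catalog.refreshTable)
  }
  catalog.dropDatabase(databaseName, ifExists, cascade)
  Seq.empty[Row]
}
```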
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]