MaxGekk commented on a change in pull request #31066:
URL: https://github.com/apache/spark/pull/31066#discussion_r552899284
##########
File path:
sql/core/src/main/scala/org/apache/spark/sql/execution/command/ddl.scala
##########
@@ -675,7 +675,7 @@ case class AlterTableRecoverPartitionsCommand(
     // This is always the case for Hive format tables, but is not true for Datasource tables created
     // before Spark 2.1 unless they are converted via `msck repair table`.
     spark.sessionState.catalog.alterTable(table.copy(tracksPartitionsInCatalog = true))
-    catalog.refreshTable(tableName)
+    spark.catalog.refreshTable(tableIdentWithDB)
Review comment:
I wonder what the use cases are where we need to update only the metadata but not the cached table data. Looking at the places where `SessionCatalog.refreshTable` is used:
1. https://github.com/apache/spark/blob/c62b84a0432e51fd10e628088ee311dc3be73d2f/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/InsertIntoHiveTable.scala#L113
2. https://github.com/apache/spark/blob/271c4f6e00b7bc7c47d84a8e59018e84a19c9822/sql/core/src/main/scala/org/apache/spark/sql/DataFrameWriter.scala#L729
3. https://github.com/apache/spark/blob/2ab77d634f2e87b080786f4f39cb17e0994bc550/sql/core/src/main/scala/org/apache/spark/sql/execution/command/ddl.scala#L245
4. https://github.com/apache/spark/blob/ddc0d5148ac6decde160cca847b5db5d6de1be58/sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala#L233
5. https://github.com/apache/spark/blob/ddc0d5148ac6decde160cca847b5db5d6de1be58/sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala#L394
In all those ^^ places, updating the cached table data makes sense too, IMHO.
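For illustration, a minimal sketch of the difference between the two APIs (the table name `t` and the local session are my assumptions, not from this PR):

```scala
import org.apache.spark.sql.SparkSession

// Sketch: contrast SessionCatalog.refreshTable (metadata only) with
// Catalog.refreshTable (metadata plus recaching of cached data).
object RefreshTableSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .getOrCreate()

    // SessionCatalog.refreshTable invalidates the cached metadata/relation
    // for the table in the session catalog, but does not touch cached data:
    val ident = spark.sessionState.sqlParser.parseTableIdentifier("t")
    spark.sessionState.catalog.refreshTable(ident)

    // spark.catalog.refreshTable additionally invalidates and recaches any
    // cached data that references the table, so subsequent queries see the
    // recovered partitions:
    spark.catalog.refreshTable("t")

    spark.stop()
  }
}
```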