sunchao commented on a change in pull request #31364:
URL: https://github.com/apache/spark/pull/31364#discussion_r566972193
##########
File path: sql/core/src/main/scala/org/apache/spark/sql/internal/CatalogImpl.scala
##########

@@ -515,14 +515,20 @@ class CatalogImpl(sparkSession: SparkSession) extends Catalog {
   }
   /**
-   * Invalidates and refreshes all the cached data and metadata of the given table or view.
-   * For Hive metastore table, the metadata is refreshed. For data source tables, the schema will
-   * not be inferred and refreshed.
+   * The method fully refreshes a table or view with the given name including:
+   *   1. The relation cache in the session catalog. The method removes table entry from the cache.
+   *   2. The file indexes of all relations used by the given view.
+   *   3. Table/View schema in the Hive Metastore if the SQL config
+   *      `spark.sql.hive.caseSensitiveInferenceMode` is set to `INFER_AND_SAVE`.
+   *   4. Cached data of the given table or view, and all its dependents that refer to it.
+   *      Existing cached data will be cleared and the cache will be lazily filled when
+   *      the next time the table/view or the dependents are accessed.
    *
-   * If this table is cached as an InMemoryRelation, re-cache the table and its dependents lazily.
+   * The method does not do:
+   *   - schema inference for file source tables
+   *   - statistics update
    *
-   * In addition, refreshing a table also clear all caches that have reference to the table
-   * in a cascading manner. This is to prevent incorrect result from the otherwise staled caches.
+   * The method is supposed to use in all cases when need to refresh table/view data and meta-data.
Review comment:
nit: supposed to use -> supposed to be used
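
For context, the behavior the new doc comment describes can be exercised with a short snippet. This is an illustrative sketch only, not part of the patch; it assumes an active `SparkSession` named `spark` and an existing table `t`, and uses only public `Catalog` APIs (`cacheTable`, `refreshTable`):

```scala
// Sketch of the documented refreshTable semantics (assumes `spark` and table `t` exist).
spark.catalog.cacheTable("t")
spark.table("t").count()        // materializes the in-memory cache

// ... underlying data files change outside of Spark here ...

spark.catalog.refreshTable("t") // drops the relation cache entry and file indexes;
                                // cached data (and dependent caches) is invalidated,
                                // but re-caching is lazy, not eager
spark.table("t").count()        // first access after the refresh re-fills the cache
```

Note the lazy re-cache: `refreshTable` itself does not trigger a scan; the cache is repopulated only on the next access to the table or its dependents, as item 4 of the doc comment states.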