viirya commented on a change in pull request #30187:
URL: https://github.com/apache/spark/pull/30187#discussion_r514884174



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/internal/CatalogImpl.scala
##########
@@ -524,14 +524,17 @@ class CatalogImpl(sparkSession: SparkSession) extends Catalog {
     // If this table is cached as an InMemoryRelation, drop the original
     // cached version and make the new version cached lazily.
     val cache = sparkSession.sharedState.cacheManager.lookupCachedData(table)
+
+    // uncache the logical plan.
+    // note this is a no-op for the table itself if it's not cached, but will invalidate all
+    // caches referencing this table.
+    sparkSession.sharedState.cacheManager.uncacheQuery(table, cascade = true)
+
     if (cache.nonEmpty) {
       // save the cache name and cache level for recreation
       val cacheName = cache.get.cachedRepresentation.cacheBuilder.tableName
       val cacheLevel = cache.get.cachedRepresentation.cacheBuilder.storageLevel
 
-      // uncache the logical plan.
-      sparkSession.sharedState.cacheManager.uncacheQuery(table, cascade = true)
-
       // recache with the same name and cache level.
      sparkSession.sharedState.cacheManager.cacheQuery(table, cacheName, cacheLevel)

Review comment:
       Hmm, we will recache the table lazily, but for the other caches referencing this table, we just uncache them and never recache them. Doesn't that seem inconsistent?




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


