viirya commented on a change in pull request #30699:
URL: https://github.com/apache/spark/pull/30699#discussion_r540601454
##########
File path: sql/core/src/main/scala/org/apache/spark/sql/internal/CatalogImpl.scala
##########
@@ -538,8 +538,12 @@ class CatalogImpl(sparkSession: SparkSession) extends Catalog {
       val cacheName = cache.get.cachedRepresentation.cacheBuilder.tableName
       val cacheLevel = cache.get.cachedRepresentation.cacheBuilder.storageLevel
+      // creates a new logical plan since the old table refers to old relation which
+      // should be refreshed
+      val newTable = sparkSession.table(tableIdent)
+
       // recache with the same name and cache level.
-      sparkSession.sharedState.cacheManager.cacheQuery(table, cacheName, cacheLevel)
+      sparkSession.sharedState.cacheManager.cacheQuery(newTable, cacheName, cacheLevel)
Review comment:
Hmm, I see. `cacheTable` doesn't take a custom cache table name as a parameter so far.
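
For reference, a minimal sketch of the public `Catalog` API side of this point, assuming a local Spark 3.x session; the table name `t` and the object name below are illustrative only, not taken from the PR:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.storage.StorageLevel

// Sketch: the public Catalog API lets callers pick a storage level but not a
// cache name, so the cache entry is always keyed by the table itself.
object CacheTableNameSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("cacheTable-name-sketch")
      .getOrCreate()

    // Create a throwaway managed table to cache (hypothetical name "t").
    spark.range(10).write.saveAsTable("t")

    // Public API: storage level is configurable, the cache name is not.
    spark.catalog.cacheTable("t", StorageLevel.MEMORY_ONLY)

    // Re-caching under a caller-chosen name, as the diff does via
    // cacheManager.cacheQuery(newTable, cacheName, cacheLevel), is only
    // reachable through the internal CacheManager, not through this API.
    spark.catalog.uncacheTable("t")
    spark.stop()
  }
}
```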