srowen commented on a change in pull request #26543: [SPARK-29911][SQL] Uncache cached tables when session closed
URL: https://github.com/apache/spark/pull/26543#discussion_r347976141
 
 

 ##########
 File path: sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLSessionManager.scala
 ##########
 @@ -75,6 +75,9 @@ private[hive] class SparkSQLSessionManager(hiveServer: HiveServer2, sqlContext:
 
   override def closeSession(sessionHandle: SessionHandle): Unit = {
     HiveThriftServer2.listener.onSessionClosed(sessionHandle.getSessionId.toString)
+    val ctx = sparkSqlOperationManager.sessionToContexts.getOrDefault(sessionHandle, sqlContext)
+    ctx.sparkSession.sessionState.catalog.getTempViewNames().foreach(ctx.uncacheTable)
 
 Review comment:
   @cloud-fan is that a somewhat different point? I'm not an expert here. It seems like this is about unpersisting cached temp views in a session when that session is done, not about stopping Spark when all active sessions are closed. As in, is it valid to start a new session in the same context?
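   For anyone following along, here is a minimal standalone sketch of the behavior under discussion: cache a temp view in an isolated session, then uncache every temp view registered in that session's catalog when the session ends, which is roughly what the added lines do in `closeSession`. The child-session setup and names below are illustrative assumptions, not part of the PR, which works through the Thrift server's per-session `SQLContext` instead.

   ```scala
   import org.apache.spark.sql.SparkSession

   object UncacheOnSessionClose {
     def main(args: Array[String]): Unit = {
       val spark = SparkSession.builder()
         .master("local[*]")
         .appName("uncache-on-session-close")
         .getOrCreate()

       // Stand-in for a Thrift server session: an isolated child session
       // that shares the SparkContext but has its own temp-view catalog.
       val childSession = spark.newSession()
       import childSession.implicits._

       Seq((1, "a"), (2, "b")).toDF("id", "value").createOrReplaceTempView("t")
       childSession.sql("CACHE TABLE t")

       // What closeSession() would do for this session: drop the cache for
       // every temp view known to the session-local catalog.
       childSession.sessionState.catalog.getTempViewNames().foreach { name =>
         childSession.catalog.uncacheTable(name)
       }

       spark.stop()
     }
   }
   ```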
