sap1ens commented on a change in pull request #28852:
URL: https://github.com/apache/spark/pull/28852#discussion_r446422811
##########
File path:
sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveMetadataCacheSuite.scala
##########
@@ -126,4 +129,39 @@ class HiveMetadataCacheSuite extends QueryTest with SQLTestUtils with TestHiveSi
 for (pruningEnabled <- Seq(true, false)) {
   testCaching(pruningEnabled)
 }
+
+  test("cache TTL") {
+    val sparkConfWithTTl = new SparkConf().set(SQLConf.METADATA_CACHE_TTL.key, "1")
+    val newSession = SparkSession.builder.config(sparkConfWithTTl).getOrCreate().cloneSession()
+
+    withSparkSession(newSession) { implicit spark =>
Review comment:
I see that `withSQLConf` is overridden here:
https://github.com/apache/spark/blob/master/sql/core/src/test/scala/org/apache/spark/sql/internal/ExecutorSideSQLConfSuite.scala#L59-L68
and it works with `StaticSQLConf`:
https://github.com/apache/spark/blob/master/sql/core/src/test/scala/org/apache/spark/sql/internal/ExecutorSideSQLConfSuite.scala#L59-L68
It also works if I implement something similar in my test, but it still
feels hacky. Do you think we can go with this plan? In that case, the test
currently in `HiveMetadataCacheSuite` should live in its own suite/class.
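For reference, the "something similar" mentioned above could be sketched roughly like this. This is a hypothetical helper, not the actual code from `ExecutorSideSQLConfSuite`; the name `withStaticSQLConf` and its signature are assumptions made for illustration:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Hypothetical sketch: static SQL confs (like a metadata-cache TTL) cannot be
// changed on an already-running session, so the helper builds a new SparkConf,
// applies the pairs, and hands the test body a fresh cloned session instead of
// mutating SQLConf in place the way the regular withSQLConf does.
protected def withStaticSQLConf(pairs: (String, String)*)(f: SparkSession => Unit): Unit = {
  val conf = new SparkConf()
  pairs.foreach { case (key, value) => conf.set(key, value) }
  // cloneSession() gives an isolated session state on top of the shared
  // SparkContext, so we deliberately do not call stop() afterwards.
  val session = SparkSession.builder.config(conf).getOrCreate().cloneSession()
  f(session)
}
```

A TTL test could then be written as `withStaticSQLConf(SQLConf.METADATA_CACHE_TTL.key -> "1") { spark => ... }`, keeping it out of `HiveMetadataCacheSuite` as suggested, since the static conf requires its own session setup.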
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]