mridulm commented on code in PR #36237:
URL: https://github.com/apache/spark/pull/36237#discussion_r852355201
##########
core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala:
##########
@@ -325,12 +325,18 @@ private[history] class FsHistoryProvider(conf: SparkConf,
clock: Clock)
override def getListing(): Iterator[ApplicationInfo] = {
// Return the listing in end time descending order.
- listing.view(classOf[ApplicationInfoWrapper])
+ // SPARK-38896: tryWithResource cannot be used here
+ // because this method needs to return an `Iterator`.
+ val closeableIter = listing.view(classOf[ApplicationInfoWrapper])
.index("endTime")
.reverse()
- .iterator()
- .asScala
- .map(_.toApplicationInfo())
+ .closeableIterator()
+ val dataIter = closeableIter.asScala.map(_.toApplicationInfo())
+ new Iterator[ApplicationInfo] with Closeable {
+ override def hasNext: Boolean = dataIter.hasNext
+ override def next(): ApplicationInfo = dataIter.next()
+ override def close(): Unit = closeableIter.close()
+ }
Review Comment:
Since not all usages of `getListing()` drain the iterator, should we add a
`finalize` as well?
An alternative is to generalize this and ensure that all closeable iterators
are closed within the history server.
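For illustration, a minimal Java sketch of the finalize-fallback idea (hypothetical, not the actual Spark code): wrap the closeable iterator, close the underlying resource eagerly once the iterator is drained or explicitly closed, and keep `finalize` only as a last-resort safety net for callers that abandon the iterator.

```java
import java.io.Closeable;
import java.io.IOException;
import java.util.Iterator;
import java.util.concurrent.atomic.AtomicBoolean;

// Hypothetical sketch: an iterator that owns a Closeable resource and
// guarantees it is released at most once, either when drained, when
// close() is called, or (as a fallback) when the object is finalized.
class SafeCloseableIterator<T> implements Iterator<T>, Closeable {
  private final Iterator<T> delegate;
  private final Closeable resource;
  private final AtomicBoolean closed = new AtomicBoolean(false);

  SafeCloseableIterator(Iterator<T> delegate, Closeable resource) {
    this.delegate = delegate;
    this.resource = resource;
  }

  @Override public boolean hasNext() {
    boolean has = delegate.hasNext();
    if (!has) {
      closeQuietly();  // eagerly release once fully drained
    }
    return has;
  }

  @Override public T next() { return delegate.next(); }

  @Override public void close() { closeQuietly(); }

  // Idempotent close: the CAS ensures the resource is released only once.
  private void closeQuietly() {
    if (closed.compareAndSet(false, true)) {
      try {
        resource.close();
      } catch (IOException ignored) {
        // best-effort cleanup
      }
    }
  }

  // Last-resort cleanup if the caller never drains or closes the iterator.
  @SuppressWarnings("deprecation")
  @Override protected void finalize() throws Throwable {
    try {
      closeQuietly();
    } finally {
      super.finalize();
    }
  }
}
```

Note that `finalize` is deprecated in recent JDKs; `java.lang.ref.Cleaner` is the modern equivalent, but the structure (a single idempotent close path with a GC-triggered fallback) is the same either way.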
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]