mridulm commented on code in PR #36162:
URL: https://github.com/apache/spark/pull/36162#discussion_r862546499
##########
core/src/main/scala/org/apache/spark/SparkStatusTracker.scala:
##########
@@ -120,4 +120,8 @@ class SparkStatusTracker private[spark] (sc: SparkContext, store: AppStatusStore
exec.memoryMetrics.map(_.totalOnHeapStorageMemory).getOrElse(0L))
}.toArray
}
+
+ def getAppStatusStore: AppStatusStore = {
+ store
+ }
Review Comment:
This is a good point. For running tasks, we currently do not update the TSM
with these details, since the scheduler did not need them until now.
We could introduce the specific subset of metrics we want to track and update
TaskInfo with it, kept private to the scheduler package.
Thoughts?
CC @Ngone51
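
As a rough illustration of the idea above (all names here are hypothetical, not from the PR): a small case class could hold just the subset of metrics the scheduler cares about for running tasks, with an updater on the task-info side. In Spark itself these members would be `private[scheduler]`; that qualifier is omitted below so the sketch compiles standalone.

```scala
// Hypothetical subset of metrics tracked for a still-running task.
// In Spark this would live in org.apache.spark.scheduler and be
// private[scheduler]; names are illustrative only.
case class RunningTaskMetrics(
    executorRunTime: Long,
    peakExecutionMemory: Long)

// Sketch of the TaskInfo-side state: updated from executor heartbeats,
// read only by the scheduler.
class TaskInfoSketch(val taskId: Long) {
  @volatile private var _runningMetrics: Option[RunningTaskMetrics] = None

  // Called when a heartbeat carries fresh metrics for this task.
  def updateRunningMetrics(m: RunningTaskMetrics): Unit = {
    _runningMetrics = Some(m)
  }

  // None until the first update arrives.
  def runningMetrics: Option[RunningTaskMetrics] = _runningMetrics
}
```

Keeping the field `@volatile` and the setter scheduler-private would let heartbeat processing publish updates without exposing mutable state outside the scheduler package.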
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]