Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/20940#discussion_r180110683
--- Diff: core/src/main/scala/org/apache/spark/executor/Executor.scala ---
@@ -772,6 +772,12 @@ private[spark] class Executor(
val accumUpdates = new ArrayBuffer[(Long, Seq[AccumulatorV2[_, _]])]()
val curGCTime = computeTotalGcTime()
+ // get executor level memory metrics
+ val executorUpdates = new ExecutorMetrics(System.currentTimeMillis(),
+ ManagementFactory.getMemoryMXBean.getHeapMemoryUsage().getUsed(),
--- End diff --
I also think it's totally fine not to have every possible metric now, but if
this one is easy to add here, it would be nice (a rough sketch is below). In
particular, I'm thinking we'd also like to capture the memory associated with
Python if it's a PySpark app, though that is significantly more complicated,
so we don't need to do that now.
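
For reference, a minimal sketch of sampling both heap and off-heap JVM usage
through the same MemoryMXBean the diff already uses. This is illustrative, not
the actual patch: the val names are made up, and it deliberately skips the
ExecutorMetrics constructor since the full argument list is truncated above.

    import java.lang.management.ManagementFactory

    // Sample JVM memory at a single point in time via the MemoryMXBean.
    val memoryMXBean = ManagementFactory.getMemoryMXBean
    val timestamp = System.currentTimeMillis()
    // Bytes currently in use on the JVM heap (what the diff above records).
    val onHeapUsed = memoryMXBean.getHeapMemoryUsage.getUsed
    // Bytes currently in use outside the JVM heap (metaspace, code cache,
    // etc.); this is the kind of extra metric I mean. Note that direct
    // buffers are not included here; those are tracked via BufferPoolMXBean.
    val offHeapUsed = memoryMXBean.getNonHeapMemoryUsage.getUsed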
---