Github user squito commented on a diff in the pull request:
    --- Diff: core/src/main/scala/org/apache/spark/executor/Executor.scala ---
    @@ -772,6 +772,12 @@ private[spark] class Executor(
         val accumUpdates = new ArrayBuffer[(Long, Seq[AccumulatorV2[_, _]])]()
         val curGCTime = computeTotalGcTime()
    +    // get executor level memory metrics
    +    val executorUpdates = new ExecutorMetrics(System.currentTimeMillis(),
    +      ManagementFactory.getMemoryMXBean.getHeapMemoryUsage().getUsed(),
    --- End diff ---
    I also think it's totally fine to not have every metric possible now, but if 
this one is easy to add here, it would be nice.  In particular I'm thinking 
we'd also like to capture the memory associated with python if it's a pyspark 
app, though that is significantly more complicated, so we don't need to do that 
here.
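For context, here's a minimal, self-contained sketch of the `MemoryMXBean` calls the diff above relies on. The object name `MemorySnapshot` and the method names are illustrative, not part of the Spark patch; it just shows that the same bean also exposes off-heap usage, which is the kind of extra metric being discussed:

```scala
import java.lang.management.ManagementFactory

// Illustrative helper (not Spark code): samples JVM memory usage
// via the same MemoryMXBean calls used in the diff above.
object MemorySnapshot {
  private val memoryBean = ManagementFactory.getMemoryMXBean

  // Bytes currently used on the JVM heap, as sampled in the diff.
  def heapUsed(): Long =
    memoryBean.getHeapMemoryUsage.getUsed

  // Bytes currently used off-heap (metaspace, code cache, etc.) --
  // an example of an additional metric the same bean can provide.
  def nonHeapUsed(): Long =
    memoryBean.getNonHeapMemoryUsage.getUsed
}
```

Note this only covers JVM-managed memory; memory held by a forked Python worker in a pyspark app is invisible to the `MemoryMXBean`, which is why capturing it is significantly more complicated.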

