[ https://issues.apache.org/jira/browse/SPARK-29273?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16939466#comment-16939466 ]
huangweiyi commented on SPARK-29273:
------------------------------------
I do the same thing as the SHS (Spark History Server) to replay the Spark event log. When parsing
SparkListenerTaskEnd, I print out some metric values; here is the code snippet:
case taskEnd: SparkListenerTaskEnd =>
  info(s"peakExecutionMemory: ${taskEnd.taskMetrics.peakExecutionMemory}")
  info(s"executorRunTime: ${taskEnd.taskMetrics.executorRunTime}")
  info(s"executorCpuTime: ${taskEnd.taskMetrics.executorCpuTime}")
  ...
Here is the output:
19/09/27 21:31:40 INFO SparkFSProcessor: peakExecutionMemory: 0
19/09/27 21:31:40 INFO SparkFSProcessor: executorRunTime: 1253
19/09/27 21:31:40 INFO SparkFSProcessor: executorCpuTime: 924518630
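For what it's worth, one way to cross-check the value is to read it from the task's accumulables
instead of the reconstructed TaskMetrics. This is only a sketch, assuming the event log actually
records the internal.metrics.peakExecutionMemory accumulator (which may be exactly what is missing
in this issue); the listener name below is just for illustration:

import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

class PeakMemoryListener extends SparkListener {
  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    // TaskMetrics rebuilt from the event log may report 0 here, but the
    // internal accumulator, if present in the log, still carries the value.
    val fromAccum = taskEnd.taskInfo.accumulables
      .find(_.name.contains("internal.metrics.peakExecutionMemory"))
      .flatMap(_.value)
    println(s"peakExecutionMemory (from accumulable): ${fromAccum.getOrElse(0L)}")
  }
}

If the accumulator is absent from the log as well, this prints 0 too, which would point at the
event-log writer rather than the replay side.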
I have also added a PR for this issue; please help review, many thanks!
> Spark peakExecutionMemory metrics is zero
> -----------------------------------------
>
> Key: SPARK-29273
> URL: https://issues.apache.org/jira/browse/SPARK-29273
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Affects Versions: 2.4.3
> Environment: hadoop 2.7.3
> spark 2.4.3
> jdk 1.8.0_60
> Reporter: huangweiyi
> Priority: Major
>
> With Spark 2.4.3 in our production environment, I want to get the
> peakExecutionMemory value exposed by TaskMetrics, but it is always zero.