Peter Podlovics created SPARK-33320:
---------------------------------------

             Summary: ExecutorMetrics are not written to CSV and StatsD sinks
                 Key: SPARK-33320
                 URL: https://issues.apache.org/jira/browse/SPARK-33320
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.0.1
         Environment: I used the following configuration while running Spark on 
YARN:
{noformat}
spark.metrics.executorMetricsSource.enabled=true
spark.eventLog.logStageExecutorMetrics=true
spark.metrics.conf.*.sink.csv.class=org.apache.spark.metrics.sink.CsvSink
spark.metrics.conf.*.sink.servlet.class=org.apache.spark.metrics.sink.MetricsServlet
spark.metrics.conf.*.sink.servlet.path=/home/hadoop/metrics/json
spark.metrics.conf.*.sink.statsd.class=org.apache.spark.metrics.sink.StatsdSink
spark.metrics.conf.*.sink.statsd.host=localhost
spark.metrics.conf.*.sink.statsd.port=8125
spark.metrics.conf.*.sink.statsd.period=10
spark.metrics.conf.*.sink.statsd.unit=seconds
spark.metrics.conf.*.sink.statsd.prefix=spark
master.sink.servlet.path=/home/hadoop/metrics/master/json
applications.sink.servlet.path=/home/hadoop/metrics/applications/json
{noformat}
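(Note: the configuration above sets no explicit {{*.sink.csv.directory}} or {{*.sink.csv.period}}, so I assume the CsvSink defaults of {{/tmp/}} and 10 seconds apply. Under that assumption, one {{.csv}} file per metric should appear under {{/tmp}}, along the lines of the illustrative path below; the exact prefix depends on {{spark.metrics.namespace}} and the executor id.)
{noformat}
/tmp/<appId>.<executorId>.ExecutorMetrics.JVMHeapMemory.csv
{noformat}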
            Reporter: Peter Podlovics


Metrics from the {{ExecutorMetrics}} namespace are not written to the CSV and 
StatsD sinks, even though some of them are available through the REST API (e.g. 
{{memoryMetrics.usedOnHeapStorageMemory}}).

I couldn't find the {{ExecutorMetrics}} on either the driver or the workers.
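A minimal way to check the REST API side (the driver host and application id below are placeholders; the driver UI defaults to port 4040):
{noformat}
curl http://<driver-host>:4040/api/v1/applications/<app-id>/executors
{noformat}
The per-executor entries in that response contain {{memoryMetrics.usedOnHeapStorageMemory}}, while nothing from the {{ExecutorMetrics}} namespace appears in the CSV output directory or at the StatsD endpoint.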



