Hi,

I was looking for metrics that specify how many objects ("files") were read or
written when using Spark over S3.

The metrics listed at
https://spark.apache.org/docs/3.5.1/monitoring.html#component-instance--executor
do not include a count of objects read from or written to S3.

I do see that the Hadoop dependency Spark uses to read/write from S3 has
S3AInstrumentation, which seems to expose rich metrics.

I was wondering: is there a place I've missed where I can obtain those
read/write object-count metrics?
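For context, here is a rough sketch of what I mean, not a working answer: Hadoop's FileSystem exposes a StorageStatistics API, which the S3A connector feeds from S3AInstrumentation. Something along these lines (the bucket name is hypothetical, and it assumes hadoop-aws is on the classpath with an active SparkSession named `spark`):

```scala
import java.net.URI
import org.apache.hadoop.fs.FileSystem

// Get the S3A FileSystem instance Spark would use for this bucket.
val fs = FileSystem.get(
  new URI("s3a://some-bucket/"),          // hypothetical bucket
  spark.sparkContext.hadoopConfiguration)

// Each LongStatistic is a (name, value) counter maintained by the connector.
val it = fs.getStorageStatistics.getLongStatistics
while (it.hasNext) {
  val s = it.next()
  println(s"${s.getName} = ${s.getValue}")
}
```

What I haven't found is whether these counters are surfaced anywhere in Spark's own metrics system (e.g. the executor metrics endpoint), or whether driver/executor code has to read them directly like this.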


Thanks,

Asaf
