Github user xuanyuanking commented on a diff in the pull request:
https://github.com/apache/spark/pull/23207#discussion_r239548704
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/metric/SQLShuffleMetricsReporter.scala
---
@@ -95,3 +96,59 @@ private[spark] object SQLShuffleMetricsReporter {
FETCH_WAIT_TIME -> SQLMetrics.createTimingMetric(sc, "fetch wait time"),
RECORDS_READ -> SQLMetrics.createMetric(sc, "records read"))
}
+
+/**
+ * A shuffle write metrics reporter for SQL exchange operators. Different from
+ * [[SQLShuffleReadMetricsReporter]], we need a function of (reporter => reporter) set in the
+ * shuffle dependency, so the local SQLMetric should be transient and created on the executor.
+ * @param metrics Shuffle write metrics in the current SparkPlan.
+ * @param metricsReporter Other reporter that needs to be updated in this SQLShuffleWriteMetricsReporter.
+ */
+private[spark] case class SQLShuffleWriteMetricsReporter(
+    metrics: Map[String, SQLMetric])(metricsReporter: ShuffleWriteMetricsReporter)
--- End diff ---
Reimplement done in a780b70.
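For readers outside the PR thread, the pattern the Scaladoc describes can be sketched in isolation: a `(reporter => reporter)` function is stored in the shuffle dependency, and only when it is applied on the executor does the SQL-metric-backed wrapper get created, mirroring every update into both the SQL metric and the underlying reporter. This is a minimal standalone sketch, not the actual Spark implementation; the trait, `Metric` class, and `SQLWriteReporterSketch` names here are hypothetical stand-ins for Spark's `ShuffleWriteMetricsReporter`, `SQLMetric`, and the reporter in the diff above.

```scala
// Hypothetical stand-in for Spark's ShuffleWriteMetricsReporter interface.
trait ShuffleWriteMetricsReporter {
  def incBytesWritten(v: Long): Unit
}

// Placeholder for SQLMetric: a mutable counter living on the executor.
final class Metric {
  var value: Long = 0L
}

// Sketch of the wrapper: forwards each update to the delegate reporter
// while also accumulating it into the local metric.
final class SQLWriteReporterSketch(metric: Metric)(delegate: ShuffleWriteMetricsReporter)
    extends ShuffleWriteMetricsReporter {
  override def incBytesWritten(v: Long): Unit = {
    metric.value += v
    delegate.incBytesWritten(v)
  }
}

object Demo {
  def main(args: Array[String]): Unit = {
    // Base reporter, standing in for the one the shuffle machinery
    // hands to the task on the executor side.
    val base = new ShuffleWriteMetricsReporter {
      var total: Long = 0L
      def incBytesWritten(v: Long): Unit = total += v
    }

    // The (reporter => reporter) function carried by the dependency:
    // the Metric (and hence the wrapper) is only created when this is
    // applied, i.e. on the executor, so nothing SQLMetric-like needs to
    // be serialized from the driver.
    val metric = new Metric
    val wrap: ShuffleWriteMetricsReporter => ShuffleWriteMetricsReporter =
      r => new SQLWriteReporterSketch(metric)(r)

    val reporter = wrap(base)
    reporter.incBytesWritten(42L)
    println(s"metric=${metric.value} delegate=${base.total}") // metric=42 delegate=42
  }
}
```

Applying the function rather than shipping a pre-built reporter is the point under discussion: the driver only serializes a closure, and the metric object it closes over is constructed where the writes actually happen.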
---