Github user HeartSaVioR commented on a diff in the pull request:
https://github.com/apache/spark/pull/21469#discussion_r206754359
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/metric/SQLMetrics.scala ---
@@ -81,10 +81,10 @@ class SQLMetric(val metricType: String, initValue: Long = 0L) extends Accumulato
}
object SQLMetrics {
- private val SUM_METRIC = "sum"
- private val SIZE_METRIC = "size"
- private val TIMING_METRIC = "timing"
- private val AVERAGE_METRIC = "average"
+ val SUM_METRIC = "sum"
+ val SIZE_METRIC = "size"
+ val TIMING_METRIC = "timing"
+ val AVERAGE_METRIC = "average"
--- End diff --
This was to handle an edge case when aggregating custom metrics, specifically filtering out average metrics since they are not aggregated correctly. Since we removed the custom average metric, we no longer need to filter them out. Will revert this change along with the relevant logic.
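
To illustrate why average metrics cannot be aggregated by simple summation (this is a standalone sketch, not Spark code; the `partitions` data and names are hypothetical):

```scala
// Per-partition (sum, count) pairs for some hypothetical metric.
val partitions = Seq((10L, 1L), (90L, 9L))

// Wrong: summing the per-partition averages: 10/1 + 90/9 = 20.
val sumOfAverages = partitions.map { case (s, c) => s / c }.sum

// Right: combine sums and counts first, then divide: 100 / 10 = 10.
val (totalSum, totalCount) =
  partitions.reduce((a, b) => (a._1 + b._1, a._2 + b._2))
val trueAverage = totalSum / totalCount

println(s"sum of averages = $sumOfAverages, true average = $trueAverage")
// sum of averages = 20, true average = 10
```

So an average metric would need its underlying sum and count to be merged across tasks; summing the already-divided values produces a wrong result, which is why such metrics had to be filtered out of the aggregation.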
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]