viirya commented on a change in pull request #31451:
URL: https://github.com/apache/spark/pull/31451#discussion_r604339191



##########
File path: 
sql/core/src/main/scala/org/apache/spark/sql/execution/metric/SQLMetricInfo.scala
##########
@@ -27,4 +27,5 @@ import org.apache.spark.annotation.DeveloperApi
 class SQLMetricInfo(
     val name: String,
     val accumulatorId: Long,
-    val metricType: String)
+    val metricType: String,
+    val aggregateMethod: (Array[Long], Array[Long]) => String)

Review comment:
       The only benefit is avoiding changes to existing tests, which is pretty 
minor to me. Outside of test code, instances of classes like `SQLMetricInfo` 
are created from a `SQLMetric`, so we always need to assign the aggregate 
method from that `SQLMetric`. It doesn't make sense to have a default 
aggregate method in these classes.
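   To illustrate the point, here is a minimal, self-contained sketch (class 
shapes are simplified stand-ins for the real ones in 
`org.apache.spark.sql.execution.metric`; the `toInfo` helper and the `sumAgg` 
aggregation are hypothetical). It shows the aggregate method being threaded 
through from the originating `SQLMetric` rather than defaulted inside 
`SQLMetricInfo`:

```scala
object AggregateSketch {

  // Stand-in for SQLMetric: the metric owns its aggregate method.
  class SQLMetric(
      val name: String,
      val metricType: String,
      val aggregateMethod: (Array[Long], Array[Long]) => String)

  // Mirrors the constructor change in the diff: aggregateMethod is a
  // required parameter, always supplied by the caller.
  class SQLMetricInfo(
      val name: String,
      val accumulatorId: Long,
      val metricType: String,
      val aggregateMethod: (Array[Long], Array[Long]) => String)

  // Hypothetical conversion: SQLMetricInfo is built from a SQLMetric, so
  // the aggregate method is copied over, never defaulted.
  def toInfo(metric: SQLMetric, accumulatorId: Long): SQLMetricInfo =
    new SQLMetricInfo(metric.name, accumulatorId, metric.metricType,
      metric.aggregateMethod)

  def main(args: Array[String]): Unit = {
    // Hypothetical "sum" aggregation over per-task values; not the real
    // implementation in SQLMetrics.
    val sumAgg: (Array[Long], Array[Long]) => String =
      (values, _) => s"total: ${values.sum}"

    val metric = new SQLMetric("numOutputRows", "sum", sumAgg)
    val info = toInfo(metric, accumulatorId = 42L)
    println(info.aggregateMethod(Array(1L, 2L, 3L), Array.empty))
    // prints "total: 6"
  }
}
```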




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
