wypoon commented on a change in pull request #31451:
URL: https://github.com/apache/spark/pull/31451#discussion_r605134483



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/metric/SQLMetricInfo.scala
##########
@@ -27,4 +27,5 @@ import org.apache.spark.annotation.DeveloperApi
 class SQLMetricInfo(
     val name: String,
     val accumulatorId: Long,
-    val metricType: String)
+    val metricType: String,
+    val aggregateMethod: (Array[Long], Array[Long]) => String)

Review comment:
       I understand your point; it is just a suggestion. Existing unit tests do 
not exercise custom metrics, since those are a new feature. In my opinion, being 
able to add this new support without necessitating any code changes in the 
existing tests is actually a plus: it shows that the implementation is well 
isolated (which is, of course, not always possible). New tests for the new 
feature would naturally exercise the new parts of the code. That is just my 
opinion; at the end of the day, this is a minor point.



