cloud-fan commented on a change in pull request #31451:
URL: https://github.com/apache/spark/pull/31451#discussion_r604683543
##########
File path:
sql/core/src/main/scala/org/apache/spark/sql/execution/metric/SQLMetricInfo.scala
##########
@@ -27,4 +27,5 @@ import org.apache.spark.annotation.DeveloperApi
 class SQLMetricInfo(
     val name: String,
     val accumulatorId: Long,
-    val metricType: String)
+    val metricType: String,
Review comment:
Yes, but that's inevitable. Hopefully, most people will use the built-in
`CustomMetric` implementations, so that the history server can support them as
well. The worst case is that people need to include the custom data source jar
when starting the history server JVM.
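
For context, a minimal sketch of what a data source metric could look like, assuming a `CustomMetric` interface with `name()`, `description()`, and `aggregateTaskMetrics()` under `org.apache.spark.sql.connector.metric` (the class name and exact package/signatures here are illustrative assumptions, not taken from this PR):

```scala
// Hypothetical example: a data source metric counting bytes read by a scan.
// BytesReadMetric and the exact interface shape are assumptions for illustration.
import org.apache.spark.sql.connector.metric.CustomMetric

class BytesReadMetric extends CustomMetric {
  override def name(): String = "bytesRead"
  override def description(): String = "total bytes read by the scan"

  // Driver-side aggregation of the per-task accumulator values into the
  // string shown in the UI / history server.
  override def aggregateTaskMetrics(taskMetrics: Array[Long]): String =
    s"${taskMetrics.sum} bytes"
}
```

If the jar containing a class like `BytesReadMetric` is not on the history server classpath, the history server cannot run its aggregation logic, which is why built-in implementations (or shipping the custom jar) matter here.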
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]