zecookiez commented on code in PR #49816:
URL: https://github.com/apache/spark/pull/49816#discussion_r1963133546
##########
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/StateStore.scala:
##########
@@ -321,6 +335,73 @@ case class StateStoreCustomTimingMetric(name: String,
desc: String) extends Stat
SQLMetrics.createTimingMetric(sparkContext, desc)
}
+trait StateStoreInstanceMetric {
+ def metricPrefix: String
+ def descPrefix: String
+ def descNotes: String
+ def partitionId: Option[Int]
+ def storeName: String
+ def stateStoreProvider: String
Review Comment:
That's my bad, I forgot to address this one. I added this field to distinguish the
state store metrics, since similar metrics would be added to HDFS state stores
in a separate change.
I've just removed the field, as I don't think it's particularly necessary
👍
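For illustration, here is a hedged sketch of what the trait might look like after dropping the `stateStoreProvider` field, keeping only the members shown in the diff. The `ExampleInstanceMetric` case class, its `metricName` helper, and the metric name used below are hypothetical, not part of the PR; they just show that the remaining fields (partition ID and store name) can still identify a per-instance metric without naming the provider.

```scala
// Sketch of the trait after removing stateStoreProvider, keeping the
// abstract members from the diff above.
trait StateStoreInstanceMetric {
  def metricPrefix: String
  def descPrefix: String
  def descNotes: String
  def partitionId: Option[Int]
  def storeName: String
}

// Hypothetical concrete implementation (not in the PR), showing that
// partitionId and storeName alone can distinguish metric instances.
case class ExampleInstanceMetric(
    metricPrefix: String,
    descPrefix: String,
    descNotes: String,
    partitionId: Option[Int],
    storeName: String) extends StateStoreInstanceMetric {

  // Illustrative helper: derive a unique metric name from the
  // remaining identity fields.
  def metricName: String =
    s"$metricPrefix${partitionId.map(id => s"_partition_$id").getOrElse("")}_$storeName"
}
```

A usage sketch: `ExampleInstanceMetric("lastUploadedSnapshotVersion", "...", "", Some(3), "default").metricName` yields `"lastUploadedSnapshotVersion_partition_3_default"`, so two instances on different partitions or stores still produce distinct names.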
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]