HyukjinKwon commented on code in PR #49816:
URL: https://github.com/apache/spark/pull/49816#discussion_r1943951278
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##########
@@ -2251,6 +2251,20 @@ object SQLConf {
.booleanConf
.createWithDefault(true)
+  val STATE_STORE_PARTITION_METRICS_REPORT_LIMIT =
+    buildConf("spark.sql.streaming.stateStore.numPartitionMetricsToReport")
+      .internal()
+      .doc("Maximum number of partition-level metrics to include in state store progress " +
+        "reporting. The actual limit used will be the minimum of this configuration and " +
+        "20% of the total number of partitions (with a minimum of 1 partition). This limits " +
+        "the metrics to the N partitions with the smallest values to prevent the progress " +
+        "report from becoming too large.")
+      .version("1.0.0")
Review Comment:
Shouldn't it be 4.0.0?
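
For context, a minimal sketch (not from this PR; the names `effectiveMetricsLimit`, `configuredLimit`, and `numPartitions` are hypothetical) of how the effective reporting limit described in the doc string above might be computed:

```scala
// Sketch only: combine the configured cap with the 20%-of-partitions cap
// (floored at 1 partition), per the doc string of the new config.
def effectiveMetricsLimit(configuredLimit: Int, numPartitions: Int): Int = {
  val twentyPercentCap = math.max(1, (numPartitions * 0.2).toInt)
  math.min(configuredLimit, twentyPercentCap)
}

// Example: effectiveMetricsLimit(10, 200) == math.min(10, 40) == 10,
// so metrics for the 10 partitions with the smallest values are reported.
```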
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]