Ngone51 commented on code in PR #36162:
URL: https://github.com/apache/spark/pull/36162#discussion_r895715985
##########
core/src/main/scala/org/apache/spark/internal/config/package.scala:
##########
@@ -2073,6 +2073,37 @@ package object config {
.timeConf(TimeUnit.MILLISECONDS)
.createOptional
+  private[spark] val SPECULATION_EFFICIENCY_ENABLE =
+    ConfigBuilder("spark.speculation.efficiency.enabled")
+      .doc("When set to true, spark will evaluate the efficiency of task processing through the " +
+        "stage task metrics and only need to speculate the inefficient tasks. A task is " +
+        "inefficient when its data process rate is less than the average data process " +
+        "rate of all successful tasks in the stage multiplied by a multiplier.")
+      .version("3.4.0")
+      .booleanConf
+      .createWithDefault(true)
+
+  private[spark] val SPECULATION_EFFICIENCY_TASK_PROCESS_MULTIPLIER =
+    ConfigBuilder("spark.speculation.efficiency.process.multiplier")
Review Comment:
According to the config naming policy, we should only keep a namespace when there are
multiple configs under it. So if there is no other config under the namespace
`spark.speculation.efficiency.process`, we should name this one
`spark.speculation.efficiency.processMultiplier`
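
For example, the renamed config could look like this (a sketch only, reusing the builder chain from this diff and assuming there are no sibling configs under `spark.speculation.efficiency.process`):

```scala
  private[spark] val SPECULATION_EFFICIENCY_TASK_PROCESS_MULTIPLIER =
    // Only the config key changes; the rest of the builder chain stays as in the PR.
    ConfigBuilder("spark.speculation.efficiency.processMultiplier")
      .doc("A multiplier for evaluating the efficiency of task processing. A task is inefficient " +
        "when its data process rate is less than the average data process rate of all " +
        "successful tasks in the stage multiplied by the multiplier.")
      .version("3.4.0")
      .doubleConf
      .checkValue(v => v > 0.0 && v <= 1.0, "multiplier must be in (0.0, 1.0]")
      .createWithDefault(0.75)
```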
##########
core/src/main/scala/org/apache/spark/internal/config/package.scala:
##########
@@ -2073,6 +2073,37 @@ package object config {
.timeConf(TimeUnit.MILLISECONDS)
.createOptional
+  private[spark] val SPECULATION_EFFICIENCY_ENABLE =
+    ConfigBuilder("spark.speculation.efficiency.enabled")
+      .doc("When set to true, spark will evaluate the efficiency of task processing through the " +
+        "stage task metrics and only need to speculate the inefficient tasks. A task is " +
+        "inefficient when its data process rate is less than the average data process " +
+        "rate of all successful tasks in the stage multiplied by a multiplier.")
+      .version("3.4.0")
+      .booleanConf
+      .createWithDefault(true)
+
+  private[spark] val SPECULATION_EFFICIENCY_TASK_PROCESS_MULTIPLIER =
+    ConfigBuilder("spark.speculation.efficiency.process.multiplier")
+      .doc("A multiplier for evaluating the efficiency of task processing. A task is inefficient " +
+        "when its data process rate is less than the average data process rate of all " +
+        "successful tasks in the stage multiplied by the multiplier.")
+      .version("3.4.0")
+      .doubleConf
+      .checkValue(v => v > 0.0 && v <= 1.0, "multiplier must be in (0.0, 1.0]")
+      .createWithDefault(0.75)
+
+  private[spark] val SPECULATION_EFFICIENCY_TASK_DURATION_FACTOR =
+    ConfigBuilder("spark.speculation.efficiency.duration.factor")
Review Comment:
ditto.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]