abellina commented on a change in pull request #26078: [SPARK-29151][CORE]
Support fractional resources for task resource scheduling
URL: https://github.com/apache/spark/pull/26078#discussion_r341695371
##########
File path: core/src/main/scala/org/apache/spark/resource/ResourceUtils.scala
##########
@@ -94,8 +112,28 @@ private[spark] object ResourceUtils extends Logging {
   def parseResourceRequirements(sparkConf: SparkConf, componentName: String)
     : Seq[ResourceRequirement] = {
-    parseAllResourceRequests(sparkConf, componentName).map { request =>
-      ResourceRequirement(request.id.resourceName, request.amount)
+    listResourceIds(sparkConf, componentName).map { resourceId =>
+      val settings = sparkConf.getAllWithPrefix(resourceId.confPrefix).toMap
+      val amountDouble = settings.getOrElse(AMOUNT,
+        throw new SparkException(s"You must specify an amount for ${resourceId.resourceName}")
+      ).toDouble
Review comment:
I am adding a header to the case class `ResourceRequirement` to help clarify
this. `AMOUNT` only makes sense as a floating-point value when working with
task resource requests. `ResourceRequirement`s are at the executor level, and
a fractional resource doesn't make sense there, so those amounts should remain
integers. Perhaps `AMOUNT` is a bit too general, but I am not sure we want to
change that at this point.
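
To make the distinction concrete, here is a minimal Scala sketch of what the
comment describes. The `TaskResourceRequest` name, the `numParts` field, and
the "fraction must be <= 0.5 or a whole number" rule are assumptions drawn
from the direction of this PR, not quotes from the merged code:

```scala
// Sketch only: executor-level requirements stay integral, while task-level
// requests may be fractional. Names and fields here are illustrative.

/** Executor-level requirement: always a whole number of resources. */
case class ResourceRequirement(resourceName: String, amount: Int, numParts: Int = 1)

/** Task-level request: a fractional amount means the task only needs a
 *  share of a single resource address (e.g. 0.25 => 4 tasks per address). */
case class TaskResourceRequest(resourceName: String, amount: Double) {
  // Assumed rule: fractions must divide one address into a whole number of slots.
  require(amount <= 0.5 || amount % 1 == 0,
    s"The resource amount $amount must be either <= 0.5 or a whole number.")
}

object FractionalAmountExample extends App {
  // A task asking for a quarter of a GPU: four such tasks can share one address.
  val taskReq = TaskResourceRequest("gpu", 0.25)
  val slotsPerAddress = math.floor(1.0 / taskReq.amount).toInt
  println(s"${taskReq.resourceName}: $slotsPerAddress task slots per address")

  // An executor-level requirement stays an integer: two whole GPUs.
  val execReq = ResourceRequirement("gpu", amount = 2)
  println(s"executor requires ${execReq.amount} ${execReq.resourceName}(s)")
}
```

The design point of the comment is that fractional handling stays confined to
task-level requests, while executor-level `ResourceRequirement` amounts remain
plain integers.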