tgravescs commented on a change in pull request #33385:
URL: https://github.com/apache/spark/pull/33385#discussion_r673992181



##########
File path: core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala
##########
@@ -429,6 +429,7 @@ private[spark] class TaskSetManager(
       execId: String,
       host: String,
       maxLocality: TaskLocality.TaskLocality,
+      taskCpuAssignments: Int = 1,

Review comment:
       why default to 1 here? I'd rather not see a default since task cpus 
aren't optional. Looks like the javadoc needs updating too; also looks like 
I missed taskResourceAssignments.
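
       i.e. drop the default so every caller has to pass it explicitly, 
something like (untested sketch):

```suggestion
      taskCpuAssignments: Int,
```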

##########
File path: core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala
##########
@@ -495,6 +497,7 @@ private[spark] class TaskSetManager(
       index: Int,
       taskLocality: TaskLocality.Value,
       speculative: Boolean,
+      taskCpuAssignments: Int,

Review comment:
       nit: rename it to just taskCpus or numTaskCpus, same with the 
parameter above in resourceOffer. We don't assign specific cpus, only a 
count.
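
       e.g. (either name works):

```suggestion
      numTaskCpus: Int,
```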

##########
File path: core/src/main/scala/org/apache/spark/TaskContext.scala
##########
@@ -177,6 +177,12 @@ abstract class TaskContext extends Serializable {
    */
   def getLocalProperty(key: String): String
 
+  /**
+   * CPUs allocated to the task.
+   */
+  @Evolving

Review comment:
       probably should also add the @Since annotation.
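
       e.g. (sketch; adjust the version to whatever release this actually 
lands in):

```suggestion
  /**
   * CPUs allocated to the task.
   */
  @Evolving
  @Since("3.3.0")
```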

##########
File path: core/src/test/scala/org/apache/spark/scheduler/TaskContextSuite.scala
##########
@@ -92,7 +92,7 @@ class TaskContextSuite extends SparkFunSuite with BeforeAndAfter with LocalSpark
       0, 0, taskBinary, rdd.partitions(0), Seq.empty, 0, new Properties,
       closureSerializer.serialize(TaskMetrics.registered).array())
     intercept[RuntimeException] {
-      task.run(0, 0, null, null, Option.empty)
+      task.run(0, 0, null, 0, null, Option.empty)

Review comment:
       it would be nice to have a test that covers stage level scheduling 
as well; perhaps look at SparkContextSuite for examples. Also, using a 
value other than 1 would be good; see the sketch below.
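
       Rough, untested sketch of what I mean. It assumes the accessor added 
in this PR ends up as TaskContext.cpus(); note that outside YARN/K8s a 
custom ResourceProfile may also need dynamic allocation enabled for the 
stage level scheduling path to kick in:

```scala
import org.apache.spark.{SparkConf, SparkContext, TaskContext}
import org.apache.spark.resource.{ResourceProfileBuilder, TaskResourceRequests}

test("TaskContext.cpus reflects the ResourceProfile task cpus") {
  // One worker with 4 cores so tasks requesting 2 cpus each can schedule.
  sc = new SparkContext(new SparkConf()
    .setMaster("local-cluster[1, 4, 1024]")
    .setAppName("test"))
  // Request 2 cpus per task via stage level scheduling.
  val rp = new ResourceProfileBuilder()
    .require(new TaskResourceRequests().cpus(2))
    .build()
  val cpusSeen = sc.parallelize(1 to 4, 2)
    .withResources(rp)
    .map(_ => TaskContext.get().cpus())
    .collect()
  // Every task should see the 2 cpus requested, not the default of 1.
  assert(cpusSeen.forall(_ == 2))
}
```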

##########
File path: core/src/test/scala/org/apache/spark/scheduler/TaskDescriptionSuite.scala
##########
@@ -76,6 +76,7 @@ class TaskDescriptionSuite extends SparkFunSuite {
       originalJars,
       originalArchives,
       originalProperties,
+      cpus = 1,

Review comment:
       can we use a value > 1 here?
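
       e.g.:

```suggestion
      cpus = 2,
```

       A non-default value makes the serialization round-trip check here 
more meaningful.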




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


