xwu99 opened a new pull request #33385:
URL: https://github.com/apache/spark/pull/33385


   In stage-level resource scheduling, the allocated 3rd-party resources can be 
obtained from TaskContext via the resources() interface, but there is no API to 
get how many CPUs are allocated to the task. This PR adds a cpus() interface to 
TaskContext alongside resources().
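   For context, a minimal sketch (assuming the Spark 3.1+ stage-level scheduling APIs and an existing SparkContext `sc`) of how per-task CPUs are requested for a stage; the proposed cpus() would let the task read this value back at runtime:
   
   ```scala
   import org.apache.spark.resource.{ResourceProfileBuilder, TaskResourceRequests}
   
   // Request 2 CPUs and 1 GPU per task for stages computed from this RDD.
   val treqs = new TaskResourceRequests().cpus(2).resource("gpu", 1)
   val profile = new ResourceProfileBuilder().require(treqs).build
   val rdd = sc.parallelize(1 to 100, 4).withResources(profile)
   ```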
   
   ### What changes were proposed in this pull request?
   Add a cpus() interface to TaskContext and update the relevant code.
   
   ### Why are the changes needed?
   TaskContext has resources() to get the 3rd-party resources allocated to a task, 
but there is no API to get the number of CPUs allocated to the task.
   
   ### Does this PR introduce _any_ user-facing change?
   Yes. It adds a cpus() interface to TaskContext.
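   A hedged sketch of task-side usage (reusing the `rdd` built above, or any RDD): with the new API a task can read its CPU allotment the same way it already reads 3rd-party resources via resources():
   
   ```scala
   import org.apache.spark.TaskContext
   
   rdd.mapPartitions { iter =>
     val ctx = TaskContext.get()
     val taskCpus = ctx.cpus()                  // proposed API: CPUs allocated to this task
     val gpuAddrs = ctx.resources().get("gpu")  // existing API: 3rd-party resource addresses
       .map(_.addresses.toSeq).getOrElse(Seq.empty)
     // e.g. size a per-task thread pool using taskCpus, pin libraries to gpuAddrs
     Iterator((taskCpus, gpuAddrs.size))
   }.collect()
   ```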
   
   ### How was this patch tested?
   Added unit tests.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


