mridulm commented on pull request #30370:
URL: https://github.com/apache/spark/pull/30370#issuecomment-727133352


   > I want an executor with 2 cores and 6 GB, but only 1 core used for task
   > allocation, so at most 1 task could be running on this executor. With the
   > existing configs, there would be at most 2 tasks. I want to give each task
   > 6 GB of memory, but also use the extra CPU time for GC or other things.
   
   GC is handled by the JVM; we don't need to reserve an additional core for it.
   Without a good use case, I am not in favor of adding additional
   configurations.
   
   Having said that, if the requirement is strictly what you mentioned:
   allocate two cores and 6 GB, run only 1 task, and leave 1 core around for
   "other things" (let me assume some background processing? Not sure what is
   happening), then you can use spark.task.cpus=2.
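
   As a minimal sketch (assuming a standard executor launch; the exact sizing
   flags depend on your cluster manager), setting this in a `SparkConf` would
   look like:

   ```scala
   import org.apache.spark.SparkConf

   // Executor sized at 2 cores / 6 GB, but each task claims both CPUs,
   // so at most one task runs on the executor at a time. Note the "spare"
   // core is not pinned to GC; the JVM's GC threads simply share whatever
   // CPU the OS gives the process.
   val conf = new SparkConf()
     .set("spark.executor.cores", "2")
     .set("spark.executor.memory", "6g")
     .set("spark.task.cpus", "2")
   ```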
   




