xwu99 commented on pull request #33941:
URL: https://github.com/apache/spark/pull/33941#issuecomment-938380314


   > what exactly is the plan here, I know you wanted to get feedback, but are 
you going to add in check for all the resources to say they are compatible? 
Part of this comes down to other things as well. Like memory. I might have 
large containers with the same number of cores, are they ok to reuse. For 
instance I might have large containers that I'm using for ML vs ETL. So I think 
we need to define a policy in more detail.
   
   We can discuss this. The current reuse condition is deliberately conservative 
(it compares only cores) rather than flexible (also matching memory and other 
resources). We also need to consider whether it is easy for end users to control 
resource sharing (in your case, an executor with larger memory but the same core 
count would be reused here; accounting for memory is hard, so that is left up to 
the user) and whether it makes sense for real use cases. I have found some use 
cases for reusing based on cores, but for other resources such as GPUs I don't 
have a clear picture right now. I will think it over and discuss with you; input 
is welcome.
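   To make the conservative condition concrete, here is a minimal sketch of what 
"reuse based on cores only" means. All names here are hypothetical for 
illustration; this is not Spark's actual ResourceProfile API or the code in this 
PR:

   ```python
   from dataclasses import dataclass

   # Hypothetical executor resource profile (illustrative only, not Spark's
   # ResourceProfile class).
   @dataclass(frozen=True)
   class Profile:
       cores: int
       memory_mb: int
       gpus: int = 0

   def can_reuse(existing: Profile, requested: Profile) -> bool:
       """Conservative reuse condition: compare cores only.

       Memory and other resources (e.g. GPUs) are ignored, so an executor
       with larger memory but the same core count is considered reusable.
       """
       return existing.cores == requested.cores

   # An ML profile with big memory and a GPU vs. a lean ETL profile:
   ml = Profile(cores=4, memory_mb=32768, gpus=1)
   etl = Profile(cores=4, memory_mb=8192)
   print(can_reuse(ml, etl))  # True: cores match; memory/GPU differences ignored
   ```

   This is exactly the behavior questioned above: whether ignoring memory and 
GPUs in the predicate is acceptable, or whether the policy needs to match more 
dimensions, is the open design question.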
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


