Github user viirya commented on the issue:

    https://github.com/apache/spark/pull/15541
  
    One concern I have is a cluster of machines with different numbers of cores. This is possible if you build your cluster on a solution like EC2 and buy different kinds of nodes. In this case, the balance assigner would first consume the machines with more cores, while the packed assigner would consume the machines with fewer cores. I don't know whether this is an issue for most of you. Even if it is, it might not be solvable in this change, and we can consider it later.
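
    To illustrate what I mean, here is a rough sketch (not the actual assigner code in this PR; the names and logic are hypothetical and only meant to show the tendency on a heterogeneous cluster): a balance-style assigner that always picks the worker with the most free cores drains the big machines first, while a packed-style assigner that fills workers starting from the smallest drains the small machines first.

    ```scala
    // Illustrative sketch only; this is NOT the implementation in this PR.
    object AssignerSketch {
      case class Worker(id: String, freeCores: Int)

      // "Balance"-style: always give the next core to the worker with the most
      // free cores, so high-core machines are consumed first.
      def balanceAssign(workers: Seq[Worker], coresNeeded: Int): Map[String, Int] = {
        val free = scala.collection.mutable.Map(workers.map(w => w.id -> w.freeCores): _*)
        val assigned = scala.collection.mutable.Map(workers.map(w => w.id -> 0): _*)
        var remaining = coresNeeded
        while (remaining > 0 && free.values.exists(_ > 0)) {
          val target = free.filter(_._2 > 0).maxBy(_._2)._1
          assigned(target) += 1
          free(target) -= 1
          remaining -= 1
        }
        assigned.toMap
      }

      // "Packed"-style: fill up one worker before moving on, starting from the
      // smallest, so low-core machines are consumed first.
      def packedAssign(workers: Seq[Worker], coresNeeded: Int): Map[String, Int] = {
        var remaining = coresNeeded
        workers.sortBy(_.freeCores).map { w =>
          val take = math.min(w.freeCores, remaining)
          remaining -= take
          w.id -> take
        }.toMap
      }

      def main(args: Array[String]): Unit = {
        val cluster = Seq(Worker("big", 16), Worker("small", 4))
        println(balanceAssign(cluster, 12)) // most cores land on "big"
        println(packedAssign(cluster, 12))  // "small" is drained first, rest on "big"
      }
    }
    ```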

