Github user windkit commented on the issue:
https://github.com/apache/spark/pull/19510
@susanxhuynh Thanks for reviewing.
I want to use both `spark.mem.max` and `spark.cores.max` to limit the resources a single application can use within the cluster.
I am currently setting up a shared cluster for several users, who are allowed to configure `spark.executor.cores` and `spark.executor.memory` according to their needs. I therefore need a cap on both CPU cores and memory.
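For context, here is roughly the setup I have in mind (values are illustrative only, and `spark.mem.max` is the setting proposed in this PR rather than an existing option):

```properties
# spark-defaults.conf on the shared cluster (illustrative values)

# Per-executor sizing, chosen freely by each user
spark.executor.cores   2
spark.executor.memory  4g

# Cluster-wide caps per application
spark.cores.max        32    # existing setting: total cores one application may acquire
spark.mem.max          64g   # proposed in this PR: total memory one application may acquire
```

With both caps in place, an application would stop acquiring executors once either its total cores or its total memory reaches the limit, regardless of the per-executor sizes the user picked.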