GitHub user windkit opened a pull request:

    https://github.com/apache/spark/pull/19510

    [SPARK-22292][Mesos] Added spark.mem.max support for Mesos

    ## What changes were proposed in this pull request?
    
    To limit the amount of resources a Spark job accepts from Mesos, we can 
currently only use `spark.cores.max`, which caps CPU cores. However, with 
large-memory executors, a job can still consume all of the cluster's memory.
    
    This PR adds a `spark.mem.max` option for Mesos.
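    
    A hedged sketch of how the new cap could be used alongside the existing 
one (the master URL, values, and application names below are illustrative 
placeholders, not taken from the patch):
    
```shell
# Illustrative only: cap the job at 16 CPU cores total and, with the
# spark.mem.max option this PR proposes, 64g of memory total across executors.
spark-submit \
  --master mesos://mesos-master:5050 \
  --conf spark.cores.max=16 \
  --conf spark.mem.max=64g \
  --class org.example.MyApp myapp.jar
```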
    
    ## How was this patch tested?
    
    Added unit tests.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/windkit/spark mem_max

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/19510.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #19510
    
----
commit 7c9a1610291f5a98cc47447028d5378caffd3c51
Author: Li, YanKit | Wilson | RIT <[email protected]>
Date:   2017-10-17T05:54:06Z

    [SPARK-22292][Mesos] Added spark.mem.max support for Mesos

----


---

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
