GitHub user davies opened a pull request:

    https://github.com/apache/spark/pull/2743

    [SPARK-3888] [PySpark] limit the memory used by worker

    The memory limit of the Python worker can be configured via 
`spark.executor.python.memory.limit` (in MB). It defaults to 0, which 
means unlimited.
    
    This only works on Linux, so I did not put it in the docs or add 
tests; it should still be useful for some advanced users.
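
    For context, a plausible way to enforce such a cap on Linux is the 
standard-library resource module. The sketch below is only an assumption 
about the approach (the helper name and wiring are hypothetical), not the 
actual patch; see the commit below for the real change.

        import resource

        def set_memory_limit(limit_mb):
            # Hypothetical helper. 0 is the default for
            # spark.executor.python.memory.limit and means
            # unlimited, so leave the process untouched.
            if limit_mb <= 0:
                return
            limit_bytes = limit_mb * 1024 * 1024
            # RLIMIT_AS caps the worker's virtual address space;
            # allocations beyond it raise MemoryError instead of
            # growing without bound. Only effective on Linux here.
            resource.setrlimit(resource.RLIMIT_AS,
                               (limit_bytes, limit_bytes))

    The limit itself would be set like any other Spark conf, e.g.:

        $ spark-submit --conf spark.executor.python.memory.limit=512 app.py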

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/davies/spark limit

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/2743.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #2743
    
----
commit 52fab51c37daf7997b1770620a16169081c30567
Author: Davies Liu <[email protected]>
Date:   2014-10-10T01:09:58Z

    limit the memory used by worker
    
    it only works in Linux

----


