GitHub user MartinWeindel opened a pull request:
https://github.com/apache/spark/pull/1860
Work-around for problem with Mesos offer semantics
When using Mesos in fine-grained mode, a Spark job can run into a
deadlock when Mesos slaves have little allocatable memory left. As a
work-around, 32 MB (= Mesos MIN_MEM) is allocated for each task, to ensure
that Mesos makes new offers after task completion.
From my perspective, it would be better to fix this problem in Mesos by
dropping the memory constraint on offers, but as a temporary solution this
work-around on the Spark side should help.
See [[MESOS-1688] No offers if no memory is
allocatable](https://issues.apache.org/jira/browse/MESOS-1688) for details on
this problem.
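For illustration only, here is a minimal Scala sketch (not the actual patch) of how a fixed 32 MB memory resource could be attached to each fine-grained Mesos task using the Mesos protobuf API; the object and method names are placeholders, not code from this pull request.

    import org.apache.mesos.Protos.{Resource, Value}

    object MesosMinMemSketch {
      // Minimum memory (in MB) attached to every fine-grained task so that a
      // slave never ends up with zero allocatable memory and keeps producing
      // offers. The 32 MB figure mirrors Mesos' MIN_MEM (see MESOS-1688).
      val MinTaskMemoryMb = 32.0

      // Build a scalar "mem" resource of MinTaskMemoryMb megabytes.
      def minMemResource: Resource =
        Resource.newBuilder()
          .setName("mem")
          .setType(Value.Type.SCALAR)
          .setScalar(Value.Scalar.newBuilder().setValue(MinTaskMemoryMb))
          .build()
    }

In the scheduler backend, such a resource would be added to the task's resource list alongside the usual "cpus" resource (e.g. via the TaskInfo builder's addResources call), so that completed tasks free memory back to the slave and trigger fresh offers.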
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/MartinWeindel/spark master
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/1860.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #1860
----
commit d9d2ca61ee35eedda23e15182f5b2e19aaf62e23
Author: Martin Weindel <[email protected]>
Date: 2014-08-08T20:44:44Z
work around for problem with Mesos offering semantic (see
[https://issues.apache.org/jira/browse/MESOS-1688])
----