GitHub user andrewor14 opened a pull request:
https://github.com/apache/spark/pull/2863
[SPARK-4013] Do not create multiple actor systems on each executor
In the existing code, each coarse-grained executor runs two actor systems
concurrently. This causes more error messages than necessary to be logged when
the executor is lost or killed, because we receive a disassociation event for
each of these actor systems.
This blocks #2840.
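For illustration only, here is a minimal, self-contained Scala sketch of the
idea behind the change (ActorSystemStub, EnvSketch, and ExecutorSketch are
invented names for this sketch, not Spark code): if the executor already owns
an actor system, the environment reuses it instead of starting a second one,
so shutdown produces a single disassociation event rather than two.

    // Stand-in for akka.actor.ActorSystem, to keep the sketch self-contained.
    class ActorSystemStub(val name: String)

    object EnvSketch {
      // Reuse the supplied actor system when one exists; only create a new
      // system when the caller did not pass one in.
      def create(existing: Option[ActorSystemStub]): ActorSystemStub =
        existing.getOrElse(new ActorSystemStub("sparkExecutor"))
    }

    object ExecutorSketch extends App {
      val executorSystem = new ActorSystemStub("sparkExecutor")
      val envSystem = EnvSketch.create(Some(executorSystem))
      // Same instance: only one actor system is ever running on the executor.
      assert(envSystem eq executorSystem)
    }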
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/andrewor14/spark executor-actor-system
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/2863.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #2863
----
commit 44ce2e0961b8233f8607905a85c90488aed4b53d
Author: Andrew Or <[email protected]>
Date: 2014-10-20T21:50:47Z
Avoid starting two actor systems on each executor
This also includes a minor refactor to expose a nicer interface
for creating SparkEnvs on drivers and executors.
----
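As a rough, hypothetical sketch of what such an interface could look like
(SparkEnvSketch and its method names are assumptions for illustration, not
necessarily what the patch adds): two explicit entry points that wrap a single
private factory, one for drivers and one for executors, so callers no longer
assemble the environment ad hoc.

    // Hypothetical shape of the refactor; names and parameters are invented.
    object SparkEnvSketch {
      // Single private factory that both entry points delegate to.
      private def create(isDriver: Boolean, host: String, port: Int): String =
        s"env(driver=$isDriver, $host:$port)"

      // Explicit entry point used on the driver.
      def createDriverEnv(host: String, port: Int): String =
        create(isDriver = true, host, port)

      // Explicit entry point used on each executor.
      def createExecutorEnv(host: String, port: Int): String =
        create(isDriver = false, host, port)
    }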