GitHub user andrewor14 opened a pull request:
https://github.com/apache/spark/pull/2170
[HOTFIX] Wait for EOF only for the PySpark shell
In `SparkSubmitDriverBootstrapper`, we wait for the parent process to send
us an `EOF` before finishing the application. This is appropriate for the
PySpark shell because we terminate the application the same way. However, if we
run a Python application instead, the JVM never exits unless it
receives a manual EOF from the user, and this is causing a few tests to time out.

We only need to wait for EOF in the PySpark shell case, because that is the only
case in which Spark submit runs as a Python subprocess. Thus, the normal Spark
shell does not need to go through this path even though it is also a REPL.
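For context, a minimal sketch of the guarded wait is below. It assumes a hypothetical
`PYSPARK_SHELL` environment variable as the signal that spark-submit was launched by the
PySpark shell as a Python subprocess; the actual detection logic in
`SparkSubmitDriverBootstrapper` may differ.

```scala
// Minimal sketch, not the actual SparkSubmitDriverBootstrapper code.
object BootstrapperSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical signal: assume only the PySpark shell sets this variable
    // when it launches spark-submit as a Python subprocess.
    val isPySparkShell = sys.env.get("PYSPARK_SHELL").contains("1")

    // ... launch the driver JVM and redirect its streams here ...

    if (isPySparkShell) {
      // Block until the Python parent closes our stdin (i.e. sends EOF).
      // Without this guard, a standalone Python application would hang here
      // forever, since nothing ever closes the stream.
      while (System.in.read() != -1) { /* discard input until EOF */ }
    }
  }
}
```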
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/andrewor14/spark bootstrap-hotfix
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/2170.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #2170
----
commit 42963f5460d331038dafe58f2707170d072a6df7
Author: Andrew Or <[email protected]>
Date: 2014-08-28T00:08:31Z
Do not wait for EOF unless this is the pyspark shell
Otherwise the application simply won't exit unless we manually
send an EOF.
----