Github user andrewor14 commented on a diff in the pull request:
https://github.com/apache/spark/pull/3274#discussion_r20398264
--- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmitDriverBootstrapper.scala ---
@@ -139,14 +139,14 @@ private[spark] object SparkSubmitDriverBootstrapper {
     // subprocess there already reads directly from our stdin, so we should avoid spawning a
     // thread that contends with the subprocess in reading from System.in.
     val isWindows = Utils.isWindows
-    val isPySparkShell = sys.env.contains("PYSPARK_SHELL")
+    val isPySpark = sys.env.contains("PYSPARK")
     if (!isWindows) {
       val stdinThread = new RedirectThread(System.in, process.getOutputStream, "redirect stdin")
       stdinThread.start()
       // For the PySpark shell, Spark submit itself runs as a python subprocess, and so this JVM
       // should terminate on broken pipe, which signals that the parent process has exited. In
       // Windows, the termination logic for the PySpark shell is handled in java_gateway.py
--- End diff --
Then we should rephrase this comment:
```
/**
 * The Spark submit JVM may run as a python subprocess, in which case the
 * JVM should terminate on broken pipe, which signals that the parent process
 * has exited. This is the case if the application is launched directly from
 * python, as in the PySpark shell. In Windows, the termination logic is
 * handled in java_gateway.py.
 */
```
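For anyone skimming this thread, the pattern the comment describes is: copy the driver's stdin to the subprocess, and terminate the JVM when a write fails because the parent python process has exited and closed the pipe. A minimal sketch of that idea follows; the class name `ExitOnBrokenPipe` and the exit handling are illustrative assumptions, not Spark's actual `RedirectThread`:
```scala
import java.io.{IOException, InputStream, OutputStream}

// Illustrative sketch only: copy bytes from `in` to `out`, and exit the
// JVM if an I/O error occurs. A broken pipe on write is the signal that
// the parent python process has exited.
class ExitOnBrokenPipe(in: InputStream, out: OutputStream, name: String)
  extends Thread(name) {
  setDaemon(true)
  override def run(): Unit = {
    val buf = new Array[Byte](1024)
    try {
      var n = in.read(buf)
      while (n != -1) {
        out.write(buf, 0, n)
        out.flush()
        n = in.read(buf)
      }
    } catch {
      case _: IOException =>
        // Broken pipe: the parent process is gone, so this JVM should go too.
        sys.exit(0)
    }
  }
}

// Usage mirroring the diff above (assumes a `process` subprocess handle):
// new ExitOnBrokenPipe(System.in, process.getOutputStream, "redirect stdin").start()
```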