GitHub user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/1197#issuecomment-47137043
The thing is, pyspark is still broken if we fix (a) but not (b). For
example, if your driver cannot communicate with the master for some reason,
Spark normally prints warning messages like "Cannot connect to master". If
Spark logging is masked, then running `sc.parallelize` in this case still
hangs without any output. This is actually the case I personally ran into in
the first place.
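The failure mode can be sketched with plain Python logging, independent of Spark (the function and logger names here are illustrative, not Spark APIs):

```python
import logging

# Illustrative sketch (not Spark code): when the driver's logger is
# silenced, the one diagnostic that would explain a hang never reaches
# the user, so the failure looks like a silent freeze.
def try_connect(master_reachable, log):
    if not master_reachable:
        log.warning("Cannot connect to master")  # the only clue the user gets
        return False
    return True

# Capture emitted log messages so we can compare both scenarios.
records = []
handler = logging.Handler()
handler.emit = lambda rec: records.append(rec.getMessage())

log = logging.getLogger("driver-demo")
log.addHandler(handler)
log.setLevel(logging.WARNING)
log.propagate = False

# With logging visible, the failure explains itself.
try_connect(False, log)
print(records)  # the warning is captured

# With logging masked, the same failure produces no output at all.
records.clear()
log.disabled = True
try_connect(False, log)
print(records)  # empty: the user sees nothing
```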
Since issues (a) and (b) are related and share a simple common fix, I think
it makes sense to fix them both at once. I agree that (c) should be filed as
a new issue and is outside the scope of this one. For now, I just want to
make sure pyspark is not broken on master.