Github user zjffdu commented on the pull request:
https://github.com/apache/spark/pull/11211#issuecomment-185149774
I verified it on spark-1.4.1 and spark-1.5.2; both of them have this issue,
so I believe it has existed for a long time. Besides that, I found that
PySpark-specific environment variables are not propagated to the driver in
cluster mode. I created SPARK-13360 for that; after SPARK-13360 is resolved
I will update this PR.
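For illustration of the propagation issue: a variable like `PYSPARK_PYTHON` exported on the submitting machine is seen by the driver in client mode (the driver runs locally), but a cluster-mode driver runs in a remote container and does not inherit the local shell environment. On YARN it typically has to be forwarded explicitly via `spark.yarn.appMasterEnv.*`. A sketch, assuming a YARN deployment and a hypothetical Python path:

```shell
# Client mode: driver runs on the submitting host and inherits this shell's
# environment, so the export below is enough.
export PYSPARK_PYTHON=/opt/python3/bin/python
spark-submit --master yarn --deploy-mode client app.py

# Cluster mode: driver runs inside a YARN container that does NOT inherit
# the local environment; the variable must be forwarded explicitly.
spark-submit --master yarn --deploy-mode cluster \
  --conf spark.yarn.appMasterEnv.PYSPARK_PYTHON=/opt/python3/bin/python \
  app.py
```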