GitHub user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/30#issuecomment-41827629
I tried your examples with the latest master jar, and both of them
work. What happens if you do the following on the submit node:
```
# point PYTHONPATH at the Spark assembly jar and check that both imports succeed
PYTHONPATH=<jar> python
...
>>> import pyspark
>>> import py4j
```
Then set `yarn.nodemanager.delete.debug-delay-sec` to a high value (as you
previously suggested), go to the `yarn.nodemanager.local-dirs` on each
container node, and try the same imports against the Spark jar staged there.
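For reference, a minimal sketch of that container-side check; the local-dir and filecache paths below are illustrative placeholders, not the exact layout on your cluster:
```
# On a NodeManager host, after setting yarn.nodemanager.delete.debug-delay-sec
# (in yarn-site.xml) high enough that container dirs survive the application:
# locate the staged Spark jar under one of the yarn.nodemanager.local-dirs...
find <nm-local-dir> -name 'spark*.jar'
# ...and repeat the same import check against it
PYTHONPATH=<path-to-staged-spark.jar> python
>>> import pyspark
>>> import py4j
```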
For me, both pyspark and py4j import successfully on both the
submit node and the container nodes.