Github user tgravescs commented on the pull request:

    https://github.com/apache/spark/pull/30#issuecomment-41832999
  
    Just trying it from my build directory, it doesn't find pyspark, so perhaps 
my jar isn't built right. Although when I look at the jar, it's in there. 
    
    $ PYTHONPATH=assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop2.4.0.jar python
    Python 2.6.6 (r266:84292, May 27 2013, 05:35:12) 
    [GCC 4.4.7 20120313 (Red Hat 4.4.7-3)] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    >>> import pyspark
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    ImportError: No module named pyspark
    
    $ jar -tvf assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop2.4.0.jar | grep pyspark
         0 Wed Apr 30 14:45:44 UTC 2014 pyspark/
      8970 Wed Apr 30 14:45:44 UTC 2014 pyspark/tests.py
      4080 Wed Apr 30 14:45:44 UTC 2014 pyspark/worker.py
      ...
      ...
      4333 Wed Apr 30 14:45:44 UTC 2014 pyspark/statcounter.py
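
    For context, `PYTHONPATH` pointing at a jar relies on Python's stdlib 
zipimport mechanism: a zip-format archive on `sys.path` is importable as long 
as zipimport can read it and the package's `__init__.py` is present in the 
archive (the truncated listing above doesn't show whether `pyspark/__init__.py` 
made it into the jar). A minimal sketch of that mechanism, using a hypothetical 
`demo_pkg` package built into a throwaway zip:

    ```python
    # Sketch: importing a package from a zip archive on sys.path,
    # the same zipimport mechanism that PYTHONPATH=<assembly jar> relies on.
    import os
    import sys
    import tempfile
    import zipfile

    tmpdir = tempfile.mkdtemp()
    archive = os.path.join(tmpdir, "bundle.zip")

    # Build a tiny archive containing a package, mimicking pyspark/ in the jar.
    # Note the package-level __init__.py -- without it the import fails.
    with zipfile.ZipFile(archive, "w") as zf:
        zf.writestr("demo_pkg/__init__.py", "VERSION = '0.1'\n")

    sys.path.insert(0, archive)  # equivalent to PYTHONPATH=bundle.zip
    import demo_pkg
    print(demo_pkg.VERSION)  # -> 0.1
    ```

    If the equivalent import fails against the assembly jar even though 
`jar -tvf` shows the files, that points at zipimport being unable to read the 
archive rather than the files being absent.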

