Is there a documented/preferred method for installing PySpark on a local machine? I want to be able to run a Python interpreter on my local machine, point it at my Spark cluster, and go. There doesn't appear to be a setup.py file anywhere, nor is pyspark registered on PyPI. I'm happy to contribute both, but I want to hear what the preferred method is first.
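For concreteness, here is roughly the workflow I'd like to support from a stock local Python interpreter. This is only a sketch of the sys.path workaround I have in mind, not a tested recipe; SPARK_HOME, the py4j zip location, and the master URL are all assumptions about a typical local Spark unpacking:

    import glob
    import os
    import sys

    # Assumption: Spark has been unpacked locally and SPARK_HOME points at it.
    spark_home = os.environ.get("SPARK_HOME", "/opt/spark")

    # Make the bundled pyspark package and its py4j dependency importable.
    sys.path.insert(0, os.path.join(spark_home, "python"))
    for zip_path in glob.glob(os.path.join(spark_home, "python", "lib", "py4j-*.zip")):
        sys.path.insert(0, zip_path)

    from pyspark import SparkConf, SparkContext

    # Placeholder master URL for the remote cluster.
    conf = SparkConf().setMaster("spark://master-host:7077").setAppName("local-repl")
    sc = SparkContext(conf=conf)
    print(sc.parallelize(range(100)).sum())  # quick smoke test
    sc.stop()

A proper setup.py (or a package on PyPI) would collapse all of the path manipulation above into a simple pip install.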
Uri

--
Uri Laserson, PhD
Data Scientist, Cloudera
Twitter/GitHub: @laserson
+1 617 910 0447
laser...@cloudera.com