GitHub user bzz commented on the pull request:

    https://github.com/apache/incubator-zeppelin/pull/118#issuecomment-115448347
  
    Thank you very much for contributing this!
    
    
    It would be great to have a high-level summary of the changes, so please 
correct me in case I misunderstand something:
    
    This PR lets pyspark users skip setting the PYTHONPATH env var, copying 
Python modules to every node of the cluster, and installing Spark (in the case 
of pyspark in local mode on a single machine). It does so by adding a new 
artifact to the Zeppelin build, a python one hidden behind an optional build 
profile, that brings in py4j as well as the Python code of pyspark: it 
downloads (and caches) an actual Spark distribution and re-packs those pieces 
into a zip file that is available on the Zeppelin classpath at runtime.
    
    Is that correct?
    If it is, that sounds great to me; pyspark not working in Zeppelin with 
the local interpreter, without having Spark installed, was very frustrating.
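    Just to illustrate my understanding of the last step: once the re-packed 
zip is on the path, Python can import pyspark straight out of it via the 
standard zipimport mechanism, no installed Spark needed. A minimal sketch 
(the `fakepyspark` module and file names here are made up for illustration, 
not taken from the PR):

    ```python
    import os
    import sys
    import tempfile
    import zipfile

    # Build a tiny zip archive the way the profile would re-pack
    # pyspark/py4j sources from the downloaded Spark distribution.
    tmp = tempfile.mkdtemp()
    zip_path = os.path.join(tmp, "pyspark-like.zip")
    with zipfile.ZipFile(zip_path, "w") as zf:
        zf.writestr("fakepyspark/__init__.py", "VERSION = '0.0-demo'\n")

    # Putting the zip on sys.path is analogous to Zeppelin exposing the
    # re-packed zip on its classpath/PYTHONPATH at runtime.
    sys.path.insert(0, zip_path)

    import fakepyspark  # imported directly from the zip
    print(fakepyspark.VERSION)  # -> 0.0-demo
    ```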
    
    One question: the python dir is not a Maven submodule now, but maybe it 
should become one some day. What do you guys think?

