Github user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/30#issuecomment-40527887
  
    @sryza I think this is looking good. I played around with this on a local 
YARN install and it worked. I have only two remaining points. First, could we 
ditch requiring SPARK_JAR? I'm going to merge a patch shortly that removes that 
requirement. Second, could we just automatically create the pyspark zip file 
rather than exposing that step to the user?
    
    Eventually we'll probably bundle this inside of the Spark assembly... but 
in the meantime it would be nice to have a thing that "just works" for users, 
where they don't have to e.g. set environment variables.
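    For the second point, a minimal sketch of what automatic archive creation 
could look like (the helper name `zip_py_files` and the layout are hypothetical, 
not the actual Spark code):

    ```python
    import os
    import zipfile

    def zip_py_files(src_dir, zip_path):
        """Recursively add the .py files under src_dir to a zip archive.

        Hypothetical helper illustrating the idea of building the pyspark
        zip on the fly instead of asking the user to create it.
        """
        with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
            for root, _dirs, files in os.walk(src_dir):
                for name in files:
                    if name.endswith(".py"):
                        full = os.path.join(root, name)
                        # Store paths relative to src_dir's parent so the
                        # archive unpacks as a package (e.g. pyspark/...).
                        arcname = os.path.relpath(full, os.path.dirname(src_dir))
                        zf.write(full, arcname)
    ```

    Something along these lines could run during app submission, writing the 
zip to a temp location and shipping it automatically.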

