Github user jhlch commented on the pull request:

    https://github.com/apache/spark/pull/8318#issuecomment-209955734
  
    PySpark being pip installable would be useful to many users. I've packaged
    jars in pip-installable Python modules before, and will take a stab at it
    here. I'm going to take a different approach than the original author:
    it seems to me that locating an already-installed Spark jar and validating
    version compatibility will be error prone and will likely cause trouble
    for users. Instead, I intend to package the Spark jar as part of the
    module itself. This means there is an order of operations that must be
    followed for the deployment to PyPI to succeed. I am going to assume that
    `mvn clean package/install` has already run as part of the build and that
    the requisite jar is in target/.
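
    Roughly, the packaging I have in mind looks something like the sketch
    below. This is only a sketch, assuming the assembly jar has been copied
    from target/ into a `pyspark/jars/` directory before packaging; the
    names, paths, and version are placeholders, not a final design.

    ```python
    # setup.py -- illustrative sketch only; the layout and jar location
    # here are assumptions, not a final design.
    import glob
    import os

    from setuptools import setup

    # Assume `mvn clean package` has already run and that the assembly jar
    # has been copied from target/ into pyspark/jars/ before packaging.
    jars = glob.glob(os.path.join("pyspark", "jars", "*.jar"))
    if not jars:
        raise RuntimeError("No jars found in pyspark/jars/; "
                           "run `mvn clean package` and copy the jar first.")

    setup(
        name="pyspark",
        version="0.0.0.dev0",  # placeholder version
        packages=["pyspark"],
        # Bundle the jar(s) with the Python package so users don't need a
        # separate Spark installation. (An accompanying MANIFEST.in may be
        # needed for the jars to land in the sdist as well.)
        package_data={"pyspark": ["jars/*.jar"]},
    )
    ```

    The idea would then be for pyspark to locate the bundled jar relative to
    its own install location rather than relying on an external Spark
    installation.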
    
    Making this change is going to create a new step in the build and
    publishing process. Who is responsible for that for Spark?
    
    Is Spark supported/expected to work on Windows? I'm confident that the
    packaging I'm planning will work on Unix-like systems, but I have never
    tested it on Windows, and cross-platform compatibility for this kind of
    packaging step can be tricky and isn't guaranteed.
    
    On Tue, Apr 12, 2016 at 3:14 PM, Nicholas Chammas <[email protected]>
    wrote:
    
    > No worries. I just wanted to make sure that the idea was sound since there
    > were concerns early on about whether we should even try to package PySpark
    > independently.
    >
    > —
    > You are receiving this because you commented.
    > Reply to this email directly or view it on GitHub
    > <https://github.com/apache/spark/pull/8318#issuecomment-209125293>
    >


