GitHub user rgbkrk commented on the issue:

    https://github.com/apache/spark/pull/15659
  
    Since there seems to be a question of why this is useful:
    
    As a user (and operator), I want pyspark to be pip-installable so that I can:
    
    * `import pyspark`
    * set up a `SparkContext` "by hand" with `pyFiles` at the start, set environment variables for executors, etc. (see the sketch after this list)
    * use it from a Jupyter notebook without having to run an entirely separate "pyspark notebook"
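
    As a sketch, here is roughly what that "by hand" setup could look like once `pyspark` is importable; the app name, executor environment variable, and dependency zip below are purely illustrative:

    ```python
    from pyspark import SparkConf, SparkContext

    conf = (SparkConf()
            .setAppName("my-app")                # illustrative app name
            .setMaster("local[*]")
            .setExecutorEnv("MY_VAR", "value"))  # env var visible to executors

    # pyFiles ships extra Python code (e.g. a zip of dependencies) to executors
    sc = SparkContext(conf=conf, pyFiles=["deps.zip"])
    ```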
    
    I should not have to resort to [findspark](https://github.com/minrk/findspark) or to adding Python paths dynamically at startup just so I can `import pyspark`. [Searching around you'll find plenty of people wanting to do the same thing](https://www.google.com/?q=pyspark+import).
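
    For context, the findspark workaround that this change would make unnecessary looks roughly like this:

    ```python
    # today's workaround: patch sys.path before pyspark can be imported
    import findspark
    findspark.init()  # locates SPARK_HOME and adds its python dirs to sys.path

    import pyspark  # only importable after the init() call above
    ```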

