Github user freeman-lab commented on a diff in the pull request:

    https://github.com/apache/spark/pull/8318#discussion_r37580512
  
    --- Diff: python/pyspark/__init__.py ---
    @@ -36,6 +36,31 @@
           Finer-grained cache persistence levels.
     
     """
    +import os
    +import sys
    +
    +import xml.etree.ElementTree as ET
    +
    +if (os.environ.get("SPARK_HOME", "not found") == "not found"):
    --- End diff ---
    
    Just to clarify, it's been fairly easy to get `pyspark` to launch inside a 
notebook, either by doing what @justinuang said, or simply by setting 
`IPYTHON=1` (which does essentially the same thing).
    
    But there are scenarios where it's useful to forgo the `pyspark` launcher 
entirely, i.e. launch with `ipython` and then do all the Spark-related setup 
afterwards. One key use case is containerized notebook deployments (like 
`tmpnb`), where we want to launch/deploy notebooks in a generic way (e.g. with 
`ipython`) but still let someone import and start a `SparkContext` after the 
fact.
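    For that use case, the usual workaround is to point `sys.path` at an 
existing Spark install from inside the notebook before importing `pyspark`. A 
minimal sketch, assuming `SPARK_HOME` points at a standard Spark distribution 
layout (the helper name and the `/opt/spark` fallback are illustrative, not 
part of this PR):

```python
import glob
import os
import sys

def pyspark_path_entries(spark_home):
    """Compute the sys.path entries needed to import pyspark from a
    plain Python/IPython session (hypothetical helper; the layout
    below matches a standard Spark distribution)."""
    python_dir = os.path.join(spark_home, "python")
    # py4j ships inside Spark's python/lib directory as a versioned zip
    py4j_zips = sorted(glob.glob(os.path.join(python_dir, "lib", "py4j-*.zip")))
    return [python_dir] + py4j_zips

# In a notebook cell: extend sys.path, then `from pyspark import SparkContext`
spark_home = os.environ.get("SPARK_HOME", "/opt/spark")
sys.path[:0] = pyspark_path_entries(spark_home)
```

    This is roughly what the autodetection logic in this PR would automate, 
minus all the config that `spark-submit` normally injects.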
    
    This PR is a great step; we could get closer to that goal by adding more 
autodetection / path-setting logic (as pointed out by @Carreau). But I agree 
with @alope107 that it might be too brittle, and it would definitely be some 
work to support all the config / deployment modes that `SparkSubmit` handles 
now (which themselves change across releases). I suspect that's why the core 
devs have tried to force the language APIs to go through a common launcher, 
but it would be good to get more input.


