Github user davies commented on the pull request:

    https://github.com/apache/spark/pull/6557#issuecomment-107707857
  
    SparkConf is the consistent way to manage configuration; we have been 
moving away from environment variables since Spark 1.0, but we still keep 
compatibility with the old environment variables. 
    
    Sometimes environment variables are easier to use than SparkConf; for 
example, we can switch the version of Python in a single line:
    ```
    PYSPARK_PYTHON=pypy pypy xxx.py
    ```
    @JoshRosen may know more about `PYSPARK_PYTHON`
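
    The precedence described above (explicit configuration first, then the 
legacy environment variable, then a default) can be sketched in plain Python. 
`resolve_python_exec` is a hypothetical helper for illustration, not Spark's 
actual implementation, and the `spark.pyspark.python` key is an assumed 
configuration name:

    ```python
    import os

    def resolve_python_exec(conf: dict) -> str:
        # Hypothetical lookup order: a SparkConf-style setting wins,
        # then the legacy PYSPARK_PYTHON environment variable,
        # and finally a default interpreter name.
        return (conf.get("spark.pyspark.python")
                or os.environ.get("PYSPARK_PYTHON")
                or "python")
    ```

    This keeps the one-liner workflow working (`PYSPARK_PYTHON=pypy ...`) 
while letting an explicit configuration override it.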
