GitHub user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/2651#issuecomment-58273093
@davies Good points; we should definitely discuss this before 1.2. I guess
that the script also loads `spark-env.sh`, but this is less of an issue since
we moved most of the configuration to SparkConf.
Since we still need to support `spark-submit` and maintain
backwards-compatibility with existing uses of the `pyspark` script, what do you
think about this PR's approach? At a minimum, we need to preserve the old
behavior of the `IPYTHON=` setting. Does `PYSPARK_DRIVER_PYTHON_OPTS` seem
like a reasonable variable to add to this script?
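To make this concrete, here's a minimal sketch of what I have in mind (my own
illustration, not the code in this PR): the script could map the legacy
`IPYTHON=1` switch onto the newer `PYSPARK_DRIVER_PYTHON` variable and then
forward `PYSPARK_DRIVER_PYTHON_OPTS` when launching the driver. The variable
names match what we've been discussing, but the exact logic here is an
assumption:

```sh
# Hypothetical sketch of the launcher logic, not the actual pyspark script.

# Let users pick the driver interpreter; default to plain `python`.
PYSPARK_DRIVER_PYTHON="${PYSPARK_DRIVER_PYTHON:-python}"

# Backwards-compatibility: honor the legacy IPYTHON=1 setting by
# mapping it onto the new variable.
if [[ "$IPYTHON" == "1" ]]; then
  PYSPARK_DRIVER_PYTHON=ipython
fi

# Launch the driver, forwarding any user-supplied interpreter options,
# e.g. PYSPARK_DRIVER_PYTHON_OPTS="notebook" to start an IPython notebook.
exec "$PYSPARK_DRIVER_PYTHON" $PYSPARK_DRIVER_PYTHON_OPTS "$@"
```

Under a scheme like this, `IPYTHON=1 ./bin/pyspark` would keep working exactly
as before, while new users could write
`PYSPARK_DRIVER_PYTHON=ipython PYSPARK_DRIVER_PYTHON_OPTS="notebook" ./bin/pyspark`
instead.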
I'd still like to discuss the rest of your proposal, but I want to get the
fixes here merged first, since the current master instructions are broken and
we need to re-introduce backwards-compatibility.