Github user JoshRosen commented on a diff in the pull request:
https://github.com/apache/spark/pull/13503#discussion_r69192407
--- Diff: bin/pyspark ---
@@ -52,7 +52,7 @@ fi
# Determine the Python executable to use for the executors:
if [[ -z "$PYSPARK_PYTHON" ]]; then
-    if [[ $PYSPARK_DRIVER_PYTHON == *ipython* && $DEFAULT_PYTHON != "python2.7" ]]; then
+    if [[ $PYSPARK_DRIVER_PYTHON == *ipython* && $DEFAULT_PYTHON < "python2.7" ]]; then
--- End diff ---
If the default Python executable is named `python` but actually points to a Python 2.7+ or Python 3 interpreter, then this check will produce a false positive, because the string `python` sorts lexicographically before `python2.7` regardless of the actual version.
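As a minimal sketch of the failure mode (assuming bash, with the bare name `python` standing in for `$DEFAULT_PYTHON`):

```shell
# In bash's [[ ... ]], the < operator compares strings lexicographically,
# so the name "python" always sorts before "python2.7" -- even when the
# executable it names is actually Python 2.7+ or Python 3.
if [[ "python" < "python2.7" ]]; then
  echo "name-based check misfires: treated as pre-2.7"
fi
```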
Rather than relying on the name of the executable, it might make more sense
to shell out to Python and ask it its version. For example, I think the
following will work:
```bash
$DEFAULT_PYTHON -c 'import sys; sys.exit(int(sys.version_info <= (2, 7, 0)))'
```
If the Python version is greater than or equal to 2.7.0, this exits with
status 0; otherwise it exits with status 1. This lets you branch on the exit
status instead of relying on the executable's name.
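Putting it together, a hedged sketch of how the probe might be used in `bin/pyspark` (here `python3` is only a stand-in default for `$DEFAULT_PYTHON`, which the real script sets elsewhere):

```shell
# Sketch only: probe the interpreter's actual version via its exit status
# rather than pattern-matching on the executable's name.
DEFAULT_PYTHON="${DEFAULT_PYTHON:-python3}"  # hypothetical fallback for this sketch

if "$DEFAULT_PYTHON" -c 'import sys; sys.exit(int(sys.version_info <= (2, 7, 0)))'; then
  echo "$DEFAULT_PYTHON is >= 2.7.0"
else
  echo "$DEFAULT_PYTHON is < 2.7.0"
fi
```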