Github user holdenk commented on a diff in the pull request:
https://github.com/apache/spark/pull/15659#discussion_r86706618
--- Diff: python/pyspark/find_spark_home.py ---
@@ -59,15 +59,15 @@ def is_spark_home(path):
paths.append(os.path.join(module_home, "../../"))
except ImportError:
# Not pip installed no worries
- True
+ pass
# Normalize the paths
paths = [os.path.abspath(p) for p in paths]
try:
return next(path for path in paths if is_spark_home(path))
except StopIteration:
- print("Could not find valid SPARK_HOME while searching
%s".format(paths), file=sys.stderr)
+ print("Could not find valid SPARK_HOME while searching
{0}".format(paths), file=sys.stderr)
--- End diff --
Added, sorry (it got left out while I was working on the change for the edit-mode
tests and I reset part of the way through).
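[Editor's note, not part of the original comment: a quick illustration of the two fixes in the hunk above. A bare `True` on its own line is a no-op expression statement, so `pass` states the intent of the empty except block more clearly. Separately, `str.format()` only substitutes `{}`-style placeholders, so the old `"%s".format(paths)` printed a literal `%s` instead of the searched paths. A minimal sketch, using made-up example paths:

    paths = ["/opt/spark", "/usr/local/spark"]

    # Old behaviour: str.format() ignores the printf-style "%s",
    # so the list of paths never appears in the message.
    print("Could not find valid SPARK_HOME while searching %s".format(paths))
    # -> Could not find valid SPARK_HOME while searching %s

    # Fixed behaviour: "{0}" is a positional str.format() placeholder.
    print("Could not find valid SPARK_HOME while searching {0}".format(paths))
    # -> Could not find valid SPARK_HOME while searching ['/opt/spark', '/usr/local/spark']
]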