Stefaan Lippens created SPARK-54434:
---------------------------------------
Summary: find-spark-home should not "exit"
Key: SPARK-54434
URL: https://issues.apache.org/jira/browse/SPARK-54434
Project: Spark
Issue Type: Bug
Components: PySpark
Affects Versions: 4.1.0
Reporter: Stefaan Lippens
The find-spark-home utility (
https://github.com/apache/spark/blob/d1af9a305718d89b9260987f31323e809611966a/bin/find-spark-home
) is intended to be "sourced":
bq. ... Should be included using "source" directive.
It has a "short circuit" check that skips everything if SPARK_HOME is already set:
{code:bash}
# Short circuit if the user already has this set.
if [ ! -z "${SPARK_HOME}" ]; then
exit 0
elif ...
{code}
But this does an {{exit 0}}, which in practice means that the user's active shell
session (the one in which they are sourcing the script) is terminated.
For example, if you accidentally execute "source find-spark-home" twice, you
lose your shell.
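For illustration, a minimal way to hit this in an interactive shell (the checkout path here is hypothetical):
{code:bash}
$ cd /path/to/spark            # hypothetical Spark checkout
$ source bin/find-spark-home   # first source: SPARK_HOME gets set
$ source bin/find-spark-home   # second source: SPARK_HOME is already set,
                               # so the short circuit runs "exit 0" and
                               # terminates this interactive shell
{code}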
Instead of {{exit 0}}, I think it should use {{return 0}} or simply do nothing
(as there is nothing else after the whole if construct).
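As a rough sketch of that suggestion (not an actual patch; the elif branches of the real script are omitted here), the short circuit could read:
{code:bash}
# Short circuit if the user already has this set.
if [ ! -z "${SPARK_HOME}" ]; then
  # "return 0" ends the sourced script without closing the caller's shell.
  # (If the file were executed rather than sourced, "return" would error,
  # but find-spark-home documents that it should be included via "source".)
  return 0
fi
{code}
Alternatively, the test could be inverted ({{if [ -z "${SPARK_HOME}" ]; then ... fi}}) so that no early exit is needed at all.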