Hi,

Out of curiosity, I have tried to replace the dependency on bash with sh
in the various scripts used to launch Spark daemons and jobs. So far,
most scripts work with sh; the exception is "bin/spark-class". The
culprit is the while loop that composes the final command by parsing
the output of the launcher library.

After a bit of fiddling, I found that the following loop:
CMD=()
while IFS= read -d '' -r ARG; do
  CMD+=("$ARG")
done < <("$RUNNER" -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main "$@")

could be replaced by a single line:
CMD=$("$RUNNER" -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main "$@" | xargs -0)
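One subtlety worth flagging for reviewers: the bash array preserves arguments that contain whitespace, whereas the single-line form stores everything in one flat string, which is re-split on whitespace when later expanded unquoted (as it must be in sh, which has no arrays). A minimal sketch, using printf to stand in for the launcher output:

```shell
#!/bin/sh
# Stand-in for the launcher: emit two NUL-delimited arguments,
# the second of which contains a space.
out=$(printf 'run\0hello world' | xargs -0)

# $out is now one flat string: "run hello world". Expanding it
# unquoted re-splits on whitespace, so the embedded space is
# indistinguishable from an argument separator.
set -- $out
echo "$#"   # three words, not the two original arguments
```

In practice this only matters if any launcher-emitted argument (e.g. a classpath or Java option) can contain spaces.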

The while loop cannot be executed with sh, while the single line can.
Since on my system sh is simply a symlink to bash with some POSIX
options activated, I guess this simply means that the while loop relies
on syntax that is not POSIX compliant (arrays, process substitution,
and read -d '' are all bash extensions). Which raises two questions:

1. Would it be useful to make sure Spark scripts are POSIX compliant?
2. Is the simplification of spark-class enough to warrant a pull
request?

Thanks,
Felix-Antoine Fortin

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
