Github user tmckayus commented on a diff in the pull request:
https://github.com/apache/spark/pull/21572#discussion_r195806962
--- Diff:
resource-managers/kubernetes/docker/src/main/dockerfiles/spark/entrypoint.sh ---
@@ -38,10 +38,10 @@ fi
SPARK_K8S_CMD="$1"
if [ -z "$SPARK_K8S_CMD" ]; then
- echo "No command to execute has been provided." 1>&2
- exit 1
+ echo "No command to execute has been provided. Ignoring spark-on-k8s workflow..." 1>&2
+else
+ shift 1
--- End diff ---
This doesn't quite work: the -z test effectively checks only whether $1 is
empty or not. If $1 is non-empty but is *not* a recognized spark-on-k8s
command (i.e. driver, driver-py, or executor), it's a pass-through command
and therefore we cannot shift anything. As written, this would consume
something like "/usr/libexec/s2i/assembly.sh" and make it disappear.
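A minimal sketch of that failure mode (the argument values here are made up
for illustration; only the if/shift logic mirrors the patch):

```shell
# Simulate the container being started with a pass-through command.
set -- "/usr/libexec/s2i/assembly.sh" "some-arg"

SPARK_K8S_CMD="$1"
if [ -z "$SPARK_K8S_CMD" ]; then
  echo "No command to execute has been provided. Ignoring spark-on-k8s workflow..." 1>&2
else
  # The -z test passes for ANY non-empty $1, so this shift also runs for
  # commands that are not driver/driver-py/executor, consuming the
  # pass-through command itself.
  shift 1
fi

# "$@" is now just "some-arg"; the pass-through command has disappeared.
```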
Personally, I would do something like this and take an early out in the
unsupported case, skipping all the other environment processing:
```bash
case "$SPARK_K8S_CMD" in
driver | driver-py | executor)
shift 1
;;
*)
echo "No SPARK_K8S_CMD provided: proceeding in pass-through mode..."
exec /sbin/tini -s -- "$@"
;;
esac
```
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]