I need to run spark-submit inside a script with options that are built up programmatically. Oh, and I need to use exec to keep the same pid (so it can run as a service and be killed).
This is what I tried:

======================================================
#!/bin/bash -e

SPARK_SUBMIT=/usr/local/lib/spark/bin/spark-submit

OPTS="--class org.apache.spark.examples.SparkPi"
OPTS+=" --driver-java-options \"-Da=b -Dc=d\""

echo $SPARK_SUBMIT $OPTS spark-examples_2.10-1.1.0.jar
exec $SPARK_SUBMIT $OPTS spark-examples_2.10-1.1.0.jar
======================================================

No luck. It seems to get confused on the multiple java options. I get:

Exception in thread "main" java.lang.NoClassDefFoundError: "-Da=b
Caused by: java.lang.ClassNotFoundException: "-Da=b
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: "-Da=b. Program will exit.

I also tried many other ways of escaping the quoted java options; none of them work. Strangely, it does work if I replace the last line with the following (there is no science to this for me, I don't know much about bash, I was just trying random and probably bad things):

eval exec $SPARK_SUBMIT $OPTS spark-examples_2.10-1.1.0.jar

I am lost as to why... and there must be a better solution? It looks kinda nasty with the eval + exec.

best,
koert
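ps: after some more digging, my best guess at what is going on: when bash expands $OPTS unquoted, it word-splits the string but does not re-parse the embedded \" characters, so they stay literal and spark-submit receives "-Da=b as its own argument. eval "works" because it re-parses the whole line, which is exactly when the quotes get honored. If that is right, building the options as a bash array should avoid both eval and the escaping, since each array element survives as a single word. A sketch of that idea (same paths and jar as above; I have not verified this end to end):

======================================================
#!/bin/bash -e

SPARK_SUBMIT=/usr/local/lib/spark/bin/spark-submit

# each array element stays one word, so the embedded spaces in
# the java options survive without any quote-escaping tricks
OPTS=(--class org.apache.spark.examples.SparkPi)
OPTS+=(--driver-java-options "-Da=b -Dc=d")

# "${OPTS[@]}" expands to one word per element, exactly as stored
echo "$SPARK_SUBMIT" "${OPTS[@]}" spark-examples_2.10-1.1.0.jar
exec "$SPARK_SUBMIT" "${OPTS[@]}" spark-examples_2.10-1.1.0.jar
======================================================

Can anyone confirm whether the array approach is the idiomatic way to do this?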