Setting JVM options to Spark executors in Standalone mode

2015-01-16 Thread Michel Dufresne
Hi All,

I'm trying to set some JVM options to the executor processes in a
standalone cluster. Here's what I have in *spark-env.sh*:

jmx_opt="-Dcom.sun.management.jmxremote"
jmx_opt="${jmx_opt} -Djava.net.preferIPv4Stack=true"
jmx_opt="${jmx_opt} -Dcom.sun.management.jmxremote.port="
jmx_opt="${jmx_opt} -Dcom.sun.management.jmxremote.rmi.port=9998"
jmx_opt="${jmx_opt} -Dcom.sun.management.jmxremote.ssl=false"
jmx_opt="${jmx_opt} -Dcom.sun.management.jmxremote.authenticate=false"
jmx_opt="${jmx_opt} -Djava.rmi.server.hostname=${SPARK_PUBLIC_DNS}"
export SPARK_WORKER_OPTS="${jmx_opt}"


However, the options are showing up on the *daemon* JVM, not on the *worker*
JVMs. It has the same effect as if I were using SPARK_DAEMON_JAVA_OPTS (which
is supposed to set them on the daemon process).

Thanks in advance for your help,

Michel


Re: Setting JVM options to Spark executors in Standalone mode

2015-01-16 Thread Zhan Zhang
You can try to add it in in conf/spark-defaults.conf

 # spark.executor.extraJavaOptions  -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"

Thanks.

Zhan Zhang

On Jan 16, 2015, at 9:56 AM, Michel Dufresne sparkhealthanalyt...@gmail.com 
wrote:



-
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org



Re: Setting JVM options to Spark executors in Standalone mode

2015-01-16 Thread Marcelo Vanzin
On Fri, Jan 16, 2015 at 10:07 AM, Michel Dufresne
sparkhealthanalyt...@gmail.com wrote:
Thanks for your reply. I should have mentioned that spark-env.sh is the
only option I found because:

- I'm creating the SparkConf/SparkContext from a Play application
(therefore I'm not using the spark-submit script)

Then you can set the configuration Zhan mentioned directly on your
SparkConf object.
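
For example, a minimal sketch of doing this programmatically. The master URL
and app name are hypothetical, and the JMX flags are just a subset of the
ones from the original spark-env.sh:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: pass executor JVM options via spark.executor.extraJavaOptions
// on the SparkConf, instead of exporting env variables in spark-env.sh.
val executorJvmOpts = Seq(
  "-Dcom.sun.management.jmxremote",
  "-Djava.net.preferIPv4Stack=true",
  "-Dcom.sun.management.jmxremote.ssl=false",
  "-Dcom.sun.management.jmxremote.authenticate=false"
).mkString(" ")

val conf = new SparkConf()
  .setMaster("spark://master-host:7077") // hypothetical master URL
  .setAppName("PlayApp")                 // hypothetical app name
  .set("spark.executor.extraJavaOptions", executorJvmOpts)

val sc = new SparkContext(conf)
```

Note that a fixed JMX port in spark.executor.extraJavaOptions can collide
when multiple executors land on the same host, which is one reason the
SSL/authenticate flags are shown here without a port.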

BTW the env variable for what you want is SPARK_EXECUTOR_OPTS, but the
use of env variables to set app configuration is discouraged.
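
If you do stay with spark-env.sh, a minimal sketch of the SPARK_EXECUTOR_OPTS
equivalent (flags carried over from the original snippet; the port flags are
omitted here since a fixed port can collide across executors on one host):

```shell
# Sketch of conf/spark-env.sh using SPARK_EXECUTOR_OPTS instead of
# SPARK_WORKER_OPTS, so the flags reach the executor JVMs, not the worker.
jmx_opt="-Dcom.sun.management.jmxremote"
jmx_opt="${jmx_opt} -Dcom.sun.management.jmxremote.ssl=false"
jmx_opt="${jmx_opt} -Dcom.sun.management.jmxremote.authenticate=false"
export SPARK_EXECUTOR_OPTS="${jmx_opt}"
```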


-- 
Marcelo

-
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org