Still struggling with SPARK_JAVA_OPTS being deprecated. I am using Spark
standalone.

For example, suppose I have an Akka timeout setting that I would like to be
applied to every piece of the Spark framework (the Spark master, the Spark
workers, the executor sub-processes, spark-shell, etc.). I used to do that
with SPARK_JAVA_OPTS; now I am unsure how.
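
For concreteness, this is the old-style setup I mean in conf/spark-env.sh
(the 300-second timeout is just an example value):

    # conf/spark-env.sh -- old, now-deprecated approach; the same options
    # reached the master, the workers, the executors, and spark-shell alike
    export SPARK_JAVA_OPTS="-Dspark.akka.timeout=300"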

SPARK_DAEMON_JAVA_OPTS works for the master and workers, but not for
spark-shell, I think. And it does not seem that useful anyway: for a worker
it does not apply the settings to the executor sub-processes, whereas
SPARK_JAVA_OPTS does. So it seems SPARK_JAVA_OPTS is my only way to change
settings for the executors, yet it's deprecated?
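
As far as I can tell, the intended replacement splits that one variable into
several pieces, roughly like this (my understanding only, not verified in
every deployment mode; my.custom.flag is a made-up placeholder):

    # conf/spark-env.sh -- JVM options for the standalone master and
    # worker daemons only
    export SPARK_DAEMON_JAVA_OPTS="-Dspark.akka.timeout=300"

    # conf/spark-defaults.conf -- read by spark-submit and spark-shell;
    # a spark.* setting can go in directly and gets forwarded to executors
    spark.akka.timeout               300
    # arbitrary non-spark JVM flags for the driver and executor JVMs
    spark.driver.extraJavaOptions    -Dmy.custom.flag=foo
    spark.executor.extraJavaOptions  -Dmy.custom.flag=foo

But that still leaves me without a single place to set one option everywhere.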


On Wed, Jun 11, 2014 at 10:59 PM, elyast <lukasz.jastrzeb...@gmail.com>
wrote:

> Hi,
>
> I tried to use SPARK_JAVA_OPTS in spark-env.sh, as well as a conf/java-opts
> file, to set additional Java system properties. In that case I could connect
> to Tachyon without any problem.
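>
> For reference, the working setup was simply (the hostname below is a
> placeholder; as I understand it, bin/spark-class appends this file's
> contents to the JVM options):
>
>     # conf/java-opts
>     -Dtachyon.master.hostname=tachyon-host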
>
> However, when I tried setting the executor and driver extraJavaOptions in
> spark-defaults.conf, it doesn't work.
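>
> That is, the equivalent of the following had no effect (same placeholder
> hostname):
>
>     spark.driver.extraJavaOptions    -Dtachyon.master.hostname=tachyon-host
>     spark.executor.extraJavaOptions  -Dtachyon.master.hostname=tachyon-host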
>
> I suspect the root cause may be the following:
>
> SparkSubmit doesn't fork an additional JVM to actually run the driver or
> executor process, so these additional system properties are only set after
> the JVM is created and other classes are loaded. Tachyon's CommonConf class
> may already have been loaded by that point, and since it's a singleton it
> won't pick up any later changes to system properties.
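>
> A minimal sketch of the effect I suspect (MyConf is a made-up stand-in for
> Tachyon's CommonConf, not real Tachyon code):
>
>     // A singleton captures the system property once, when the class
>     // is first initialized.
>     object MyConf {
>       val masterHost: String =
>         System.getProperty("tachyon.master.hostname", "localhost")
>     }
>
>     object Demo extends App {
>       println(MyConf.masterHost)  // initializes MyConf: prints "localhost"
>       System.setProperty("tachyon.master.hostname", "real-host")
>       println(MyConf.masterHost)  // still "localhost" -- set too late
>     }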
>
> Please let me know what you think.
>
> Also, can I rely on conf/java-opts? It's not really documented anywhere.
>
> Best regards
> Lukasz