Hi all,

I need to configure the executor log4j.properties on a standalone Spark cluster.
Placing the properties file in the Spark configuration folder and setting
spark.executor.extraJavaOptions from my application code:
sparkConf.set("spark.executor.extraJavaOptions",
"-Dlog4j.configuration=log4j_special.properties");
does the job: the executor logs are written at the required location and
level. As far as I understand, this works because the Spark configuration
folder is on the executors' classpath, so the file can be referenced without a path.
However, I would like to avoid deploying the properties file to the Spark
configuration folder on every worker. If I bundle the properties file in my
application jar instead, is there any way to tell the executors to load it from there?
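For context, here is a sketch of how I might ship the file per-job rather than pre-deploying it to each worker (untested on my cluster; the master URL, the entry class com.example.MyApp, and the jar name are placeholders):

```shell
# Ship log4j_special.properties to each executor's working directory
# via --files, then point log4j at it by bare filename, the same way
# the classpath-based approach above references it without a path.
spark-submit \
  --master spark://master-host:7077 \
  --files log4j_special.properties \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j_special.properties" \
  --class com.example.MyApp \
  my-application.jar
```

This avoids touching the workers' configuration folders, but it still distributes a loose file alongside the jar, which is why I am asking whether the executors can load the properties directly from the application jar.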

Thanks,



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Configuring-logging-properties-for-executor-tp22572.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org