Hi Steve, I wrote a blog post on the Spark configuration that we've used, including the log4j.properties: http://progexc.blogspot.co.il/2014/12/spark-configuration-mess-solved.html (What we did was to distribute the relevant *log4j.properties* file to all of the slaves, to the same location on each.)
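To make it concrete, here is a minimal sketch of the setup (the class name, jar name, and properties-file name below are placeholders for illustration, not the exact values from our cluster; the log4j.properties path must exist at the same location on every node):

```shell
# my-app.properties -- the file handed to spark-submit via --properties-file.
# Both driver and executors are pointed at the locally distributed
# log4j.properties (file:/// URL, same path on every node):
#
#   spark.driver.extraJavaOptions   -Dlog4j.configuration=file:///spark/conf/log4j.properties
#   spark.executor.extraJavaOptions -Dlog4j.configuration=file:///spark/conf/log4j.properties

# Launch with the properties file; com.example.MyApp and my-app.jar are
# placeholders for your own application:
spark-submit \
  --class com.example.MyApp \
  --properties-file my-app.properties \
  my-app.jar
```

If the file is picked up, the "Using Spark's default log4j profile" line should no longer appear at startup.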
Hope this helps, good luck!

Quoting from the text in the post:
-------------------------------------------
The right way to pass the parameter is through the properties "*spark.driver.extraJavaOptions*" and "*spark.executor.extraJavaOptions*". I passed both the log4j configuration property and the parameter that I needed for the configuration (to the driver I was able to pass only the log4j configuration).

For example (written in a properties file passed to spark-submit with "*--properties-file*"):

"
spark.driver.extraJavaOptions -Dlog4j.configuration=file:///spark/conf/log4j.properties
spark.executor.extraJavaOptions -Dlog4j.configuration=file:///spark/conf/log4j.properties -Dapplication.properties.file=hdfs:///some/path/on/hdfs/app.properties
spark.application.properties.file hdfs:///some/path/on/hdfs/app.properties
"
-------------------------------------------

On Tue, Apr 12, 2016 at 7:51 PM, Steve Lewis <lordjoe2...@gmail.com> wrote:

> OK, I am stymied. I have tried everything I can think of to get Spark to
> use my own version of log4j.properties.
>
> In the launcher code (I launch a local instance from a Java application)
> I say -Dlog4j.configuration=conf/log4j.properties, where conf/log4j.properties
> is under user.dir - no luck.
>
> Spark always starts saying:
>
> Using Spark's default log4j profile:
> org/apache/spark/log4j-defaults.properties
>
> I have a directory conf with my log4j.properties there, but it seems to be
> ignored.
>
> I use Maven and am VERY RELUCTANT to edit the Spark jars.
>
> I know this point has been discussed here before, but I do not see a clean
> answer.

--
Best regards,
Demi Ben-Ari <http://il.linkedin.com/in/demibenari>
Entrepreneur @ Stealth Mode Startup
Twitter: @demibenari <https://twitter.com/demibenari>