> Hi,
> You can create log4j.properties for the executors, and use
> "--files log4j.properties" when submitting spark jobs.
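For readers of the archive, a minimal sketch of the setup described above (assuming Spark 1.x/2.x, which uses log4j 1.x; the paths, class name, and jar name are placeholders). The custom log4j.properties is what gives the per-appender control asked about further down, e.g. a RollingFileAppender with its own size and retention settings:

    # log4j.properties shipped to the executors (sketch)
    log4j.rootLogger=INFO, rolling
    log4j.appender.rolling=org.apache.log4j.RollingFileAppender
    # On YARN, ${spark.yarn.app.container.log.dir} resolves to the container's log directory
    log4j.appender.rolling.File=${spark.yarn.app.container.log.dir}/spark.log
    log4j.appender.rolling.MaxFileSize=50MB
    log4j.appender.rolling.MaxBackupIndex=5
    log4j.appender.rolling.layout=org.apache.log4j.PatternLayout
    log4j.appender.rolling.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

    # Submit command (class and jar names are hypothetical).
    # --files ships the file to each executor's working directory, and the
    # -D option makes the executor JVM load it as its log4j configuration.
    spark-submit \
      --master yarn \
      --files /path/to/log4j.properties \
      --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
      --class com.example.MyApp \
      my-app.jar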
In the case where we are initializing the Spark context via Java, how can we pass the same parameter?

    jsc = new JavaSparkContext(conf);

Is it possible to set this parameter in spark-defaults.conf?

> On Mon, Feb 29, 2016 at 1:50 PM, Niranda Perera <[hidden email]> wrote:
>
> Hi all,
>
> Is there any possibility to control the stdout and stderr streams in an
> executor JVM?
>
> I understand that there are some configurations provided from the spark
> conf, as follows:
>
>   spark.executor.logs.rolling.maxRetainedFiles
>   spark.executor.logs.rolling.maxSize
>   spark.executor.logs.rolling.strategy
>   spark.executor.logs.rolling.time.interval
>
> But is there a possibility to have more fine-grained control over these,
> like we do in a log4j appender, with a property file?
>
> Rgds
> --
> Niranda

> --
> Best Regards
> Jeff Zhang

Regards,
--
Anuruddha Premalala (MIEEE)
Mobile : +94717213122
E-mail : anuruddhaprema...@gmail.com
web : www.anuruddha.org
Sri Lanka.
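Regarding the programmatic question above, a minimal sketch (not from the thread; assuming the Spark 1.x/2.x Java API, with a made-up class name and file path): the --files flag corresponds to the spark.files configuration key, and the executor JVM flag to spark.executor.extraJavaOptions, so both can be set on the SparkConf before the JavaSparkContext is created, or placed in spark-defaults.conf.

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class ExecutorLoggingExample {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                .setAppName("executor-logging-example")
                // Equivalent of "--files log4j.properties": ship the file to each executor.
                .set("spark.files", "/path/to/log4j.properties")
                // Make the executor JVMs load the shipped file as their log4j configuration.
                .set("spark.executor.extraJavaOptions",
                     "-Dlog4j.configuration=log4j.properties");

            // The master URL is normally supplied by spark-submit or spark-defaults.conf.
            JavaSparkContext jsc = new JavaSparkContext(conf);
            try {
                // ... job code ...
            } finally {
                jsc.stop();
            }
        }
    }

The same two keys can instead be put in spark-defaults.conf; note that, depending on the cluster manager and deploy mode, the file may need to be known at submit time (spark-defaults.conf or spark-submit) so the executor JVMs can see it when they start.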