Yes, each application can use its own log4j.properties, but I am not sure how to configure log4j so that the driver and executors write to a file. Setting "spark.executor.extraJavaOptions" makes log4j read its configuration from a file, which is not what I need. How do I configure log4j programmatically from the app so that both the driver and the executors pick up these settings?
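For what it's worth, here is a minimal sketch of what I mean, configuring log4j 1.x programmatically (the log4j 1.2 API that ships with Spark). The object name and file path are just illustrative. The part I'm stuck on is getting the executor JVMs to run the same configuration:

```scala
import org.apache.log4j.{FileAppender, Level, Logger, PatternLayout}

object AppLogging {
  // Hypothetical per-app log file path; in practice it would be derived
  // from the application id or user name.
  def configure(logFile: String): Unit = {
    val root = Logger.getRootLogger
    root.setLevel(Level.INFO)
    val layout = new PatternLayout("%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n")
    // Append to a per-application file instead of the shared one.
    root.addAppender(new FileAppender(layout, logFile, true))
  }
}

// In the driver this is straightforward:
//   AppLogging.configure(s"/var/log/myapp/${sc.applicationId}.log")
// But executors run in separate JVMs, so this call never reaches them
// unless it is somehow executed on each executor as well -- which is
// exactly what I don't know how to do cleanly.
```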
Thanks,
Udit

On Sat, Mar 21, 2015 at 3:13 AM, Jeffrey Jedele <[email protected]> wrote:

> Hi,
> I'm not completely sure about this either, but this is what we are doing
> currently:
> Configure your logging to write to STDOUT, not to a file explicitly.
> Spark will capture stdout and stderr and separate the messages into an
> app/driver folder structure in the configured worker directory.
>
> We then use logstash to collect the logs and index them into an
> elasticsearch cluster (Spark seems to produce a lot of logging data).
> With some simple regex processing, you also get the application id as a
> searchable field.
>
> Regards,
> Jeff
>
> 2015-03-20 22:37 GMT+01:00 Ted Yu <[email protected]>:
>
>> Are these jobs the same jobs, just run by different users, or
>> different jobs? If the latter, can each application use its own
>> log4j.properties?
>>
>> Cheers
>>
>> On Fri, Mar 20, 2015 at 1:43 PM, Udit Mehta <[email protected]> wrote:
>>
>>> Hi,
>>>
>>> We have Spark set up such that various users run multiple jobs at
>>> the same time. Currently all the logs go to one file specified in
>>> the log4j.properties.
>>> Is it possible to configure log4j in Spark for per-app/user logging
>>> instead of sending all logs to the one file mentioned in
>>> log4j.properties?
>>>
>>> Thanks
>>> Udit
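For reference, the STDOUT-only setup Jeffrey describes could be sketched as a log4j.properties along these lines (levels and pattern are illustrative, not taken from an actual deployment):

```properties
# Send everything to stdout; Spark's worker captures stdout/stderr
# per application under the worker directory.
log4j.rootLogger=INFO, stdout

log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```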
