Hey guys, I'm looking for a bit of help with logging.

I'm trying to get Spark to write separate log4j logs per job within a Spark cluster.
For example, I'd like:

<$SPARK_HOME>/logs/job1.log.x
<$SPARK_HOME>/logs/job2.log.x

And I want this on both the driver and the executors.

I'm trying to accomplish this by including a log4j.properties file in each job's
resources, but it isn't logging properly.
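For reference, this is roughly the kind of setup I mean. This is just a sketch, assuming log4j 1.x (the version Spark bundles) and a made-up `job.name` system property for the per-job file name; the paths and property names here are placeholders, not a working config:

```shell
# log4j.properties shipped alongside each job.
# log4j 1.x substitutes ${job.name} from a JVM system property,
# so each job can write to its own rolling log file.
#
#   log4j.rootLogger=INFO, file
#   log4j.appender.file=org.apache.log4j.RollingFileAppender
#   log4j.appender.file.File=${spark.home}/logs/${job.name}.log
#   log4j.appender.file.MaxBackupIndex=10
#   log4j.appender.file.layout=org.apache.log4j.PatternLayout
#   log4j.appender.file.layout.ConversionPattern=%d %p %c - %m%n

# Submit the job so that both the driver and executor JVMs pick up
# the shipped config (--files puts log4j.properties in each
# executor's working directory):
spark-submit \
  --files log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties -Djob.name=job1" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties -Djob.name=job1" \
  --class com.example.MyJob myjob.jar
```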

How can I get job-level logs on both the driver and executors?

Thanks in advance for taking the time to respond.

D



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Log4j-files-per-spark-job-tp22106.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
