Hi guys,

I'm trying to enable logging in the executors but with no luck.

According to the official documentation and several blogs, this should be
done by passing
"spark.executor.extraJavaOptions=-Dlog4j.configuration=[my-file]" to the
spark-submit tool. I've tried both sending a reference to a classpath
resource and using the "file:" protocol, but nothing happens. Of course, in
the latter case I've used the --files option on the command line, although
it is not clear where this file is uploaded on the worker machine.
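For concreteness, the --files variant I tried looks roughly like the sketch below. The idea (as I understand it from the blogs) is that files passed via --files should end up in each executor's working directory, so the executor option references the bare file name; the file names here are just examples from my project layout:

```shell
bin/spark-submit --master spark://localhost:7077 \
  --files env/dev/log4j-executor.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:env/dev/log4j-driver.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j-executor.properties" \
  --class [my-main-class] [my-jar].jar
```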

However, I was able to make it work by setting the properties in the
spark-defaults.conf file, pointing to a configuration file present on each
machine. This approach has a big drawback though: if I change something in
the log4j configuration I need to change it on every machine (and I'm not
sure whether a restart is required), which is not what I'm looking for.
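For reference, the spark-defaults.conf entries that worked for me look like the following sketch; the paths are examples and have to exist at the same location on every worker:

```
spark.driver.extraJavaOptions    -Dlog4j.configuration=file:/opt/spark/conf/log4j-driver.properties
spark.executor.extraJavaOptions  -Dlog4j.configuration=file:/opt/spark/conf/log4j-executor.properties
```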

The complete command I'm using is as follows:

bin/spark-submit --master spark://localhost:7077 \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=env/dev/log4j-driver.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=env/dev/log4j-driver.properties" \
  --class [my-main-class] [my-jar].jar


Both files are in the classpath and are reachable -- already tested with
the driver.

Any comments are welcome.

Thanks in advance.
-carlos.
