Hi all!

I would love to use Spark with a somewhat more modern logging framework
than Log4j 1.2. I have Logback in mind, mostly because it integrates well
with central logging solutions such as the ELK stack. I've read up a bit on
getting Spark 2.0 (the version I'm currently using) to work with anything
other than Log4j 1.2, and it seems nigh impossible.
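For context, the kind of dependency swap I've been attempting looks roughly like this in Maven terms (a sketch of the usual exclusion approach; the exact versions here are my assumptions, not anything from the Spark docs):

```xml
<!-- Pull in Spark, but exclude its Log4j 1.2 binding and the log4j jar -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.0.0</version>
  <exclusions>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
    </exclusion>
  </exclusions>
</dependency>

<!-- Provide Logback as the slf4j backend instead -->
<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-classic</artifactId>
  <version>1.1.7</version>
</dependency>
```

This is where things break down for me: Spark's own code still references Log4j classes directly, so excluding the jar isn't enough.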

If I understood the problem correctly from the various JIRA issues I read,
Spark needs to be able to set the log level programmatically, which the
slf4j API doesn't support, and so Spark integrates with Log4j 1.2 at a
deep level.
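To make the mismatch concrete (this is my reading of the situation, not a quote from the Spark sources):

```java
import org.apache.log4j.Level;
import org.apache.log4j.Logger;

// Log4j 1.2 exposes a mutable level on its loggers, which is what
// runtime-level-setting (e.g. SparkContext.setLogLevel) relies on:
Logger.getRootLogger().setLevel(Level.WARN);

// The slf4j API, by contrast, only offers level *checks* and log calls:
// org.slf4j.Logger has isDebugEnabled(), debug(), info(), ... but no
// setLevel() — changing levels is deliberately left to the backend's
// own configuration (logback.xml, log4j.properties, etc.).
```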

My question: why would Spark want to set log levels programmatically? Why
not leave it to the user of Spark to provide a logging configuration that
suits their needs? That way, the offending code that integrates with
Log4j directly could be removed, and Spark could rely solely on the
slf4j API, as any good library should.

I'm curious about the motivations of the Spark dev team on this!

Daan
