Hi all,

I am running Spark standalone (local[*]) and have tried to cut back on
some of the logging noise from the framework by editing log4j.properties in
spark/conf.
The following lines are working as expected:

log4j.logger.org.apache.spark=WARN
log4j.logger.org.apache.spark.storage.BlockManager=ERROR

(I've even verified that it's definitely picking up my configuration by
adding a prefix to the conversion pattern.)
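
To be concrete, the prefix check was just prepending a marker to the
console appender's ConversionPattern, roughly like this ("MYCONF" is an
arbitrary tag, and I'm assuming the stock console appender from the
template):

log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=MYCONF %d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

The marker shows up on every log line, so the file is definitely being
read.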

However, I am still getting log messages at INFO from classes like:

org.apache.spark.Logging$class  (should be covered by the org.apache.spark setting)
kafka.utils.Logging$class       (when I add a similar setting for kafka.utils)

I suspect it's because these are inner classes. It still happens even when
I go up a level and add a configuration like "log4j.logger.org=WARN".
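
For completeness, the full set of logger settings I've been experimenting
with looks roughly like this:

log4j.logger.org.apache.spark=WARN
log4j.logger.org.apache.spark.storage.BlockManager=ERROR
log4j.logger.kafka.utils=WARN
log4j.logger.org=WARN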

Is this a known bug in log4j? Is there any known way to suppress these,
ideally through configuration rather than programmatically?

Many thanks
