vanzin commented on a change in pull request #23675: [SPARK-26753][CORE] Fix
for ensuring custom log levels work for spark-shell
URL: https://github.com/apache/spark/pull/23675#discussion_r251647820
##########
File path: core/src/main/scala/org/apache/spark/internal/Logging.scala
##########
@@ -213,7 +205,21 @@ private[spark] object Logging {
rootLogger.setLevel(defaultRootLevel)
rootLogger.getAllAppenders().asScala.foreach {
case ca: ConsoleAppender =>
- ca.setThreshold(consoleAppenderToThreshold.get(ca))
+ // SparkShellLoggingFilter is the last filter
+ ca.getFirstFilter() match {
+ case ssf: SparkShellLoggingFilter =>
+ ca.clearFilters()
+ case f: org.apache.log4j.spi.Filter =>
+ var previous = f
Review comment:
I think this block should be the only one you have. It would handle someone
manually adding filters outside of Spark, and it also fixes this match not
being exhaustive (it doesn't handle `ca.getFirstFilter()` being null, which
shouldn't happen but can if code outside Spark messes with the appender
directly, like this code does).
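
To illustrate the suggestion, here is a minimal sketch using a simplified stand-in for log4j 1.x's singly linked `Filter` chain (the real `SparkShellLoggingFilter` and log4j types are replaced by hypothetical local classes). The point is to walk the whole chain null-safely and unlink the shell filter wherever it sits, instead of matching only on the first filter:

```scala
// Simplified model of log4j 1.x's filter chain: each filter links to the next.
// ShellFilter stands in for SparkShellLoggingFilter; OtherFilter for any
// filter added outside of Spark.
sealed trait Filter { var next: Filter = null }
class ShellFilter extends Filter
class OtherFilter(val name: String) extends Filter

// Returns the new head of the chain with every ShellFilter removed.
// A null head (no filters installed at all) is handled rather than crashing,
// covering the non-exhaustive-match case the review points out.
def removeShellFilters(head: Filter): Filter = head match {
  case null => null
  case _: ShellFilter => removeShellFilters(head.next)
  case f =>
    f.next = removeShellFilters(f.next)
    f
}
```

A single pass like this subsumes both branches of the original match: the shell filter is dropped whether it is first, last, or in the middle, and an empty chain is a no-op.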
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.