vanzin commented on a change in pull request #23675: [SPARK-26753][CORE] Fixed custom log levels for spark-shell by using Filter instead of Threshold
URL: https://github.com/apache/spark/pull/23675#discussion_r252005243
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/internal/Logging.scala
 ##########
 @@ -213,7 +205,21 @@ private[spark] object Logging {
         rootLogger.setLevel(defaultRootLevel)
         rootLogger.getAllAppenders().asScala.foreach {
           case ca: ConsoleAppender =>
-            ca.setThreshold(consoleAppenderToThreshold.get(ca))
+            // SparkShellLoggingFilter is the last filter
+            ca.getFirstFilter() match {
+              case ssf: SparkShellLoggingFilter =>
+                ca.clearFilters()
+              case f: org.apache.log4j.spi.Filter =>
+                var previous = f
 
 Review comment:
   I still think the match is not correct: if the first filter is the one you added, you remove all filters, regardless of whether it is the only filter in the chain or not.
   
   The code in this `case` correctly handles all possible combinations.
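To illustrate the concern above: clearing every filter just because the first one is a `SparkShellLoggingFilter` can also discard unrelated filters that follow it in the chain. Below is a minimal, self-contained Java sketch that models log4j 1.x's singly-linked filter chain (each filter points to the next via `getNext()`) and removes only the shell filter, wherever it sits. The `Filter`, `Appender`, and `removeShellFilter` definitions here are simplified stand-ins, not the real log4j or Spark classes; `SparkShellLoggingFilter`, `getFirstFilter`, and `clearFilters` are the names that appear in the diff.

```java
// Minimal model of log4j 1.x's singly-linked filter chain.
// Not the real library: Filter/Appender are toy stand-ins used only to
// show how to drop one specific filter regardless of its position.

class Filter {
    private Filter next;
    Filter getNext() { return next; }
    void setNext(Filter n) { next = n; }
}

class SparkShellLoggingFilter extends Filter {}

class Appender {
    private Filter head;

    Filter getFirstFilter() { return head; }
    void clearFilters() { head = null; }

    void addFilter(Filter f) {
        if (head == null) { head = f; return; }
        Filter cur = head;
        while (cur.getNext() != null) cur = cur.getNext();
        cur.setNext(f);
    }

    // Remove every SparkShellLoggingFilter while keeping all other
    // filters, handling head, middle, and tail positions uniformly.
    void removeShellFilter() {
        // Drop shell filters sitting at the head of the chain.
        while (head instanceof SparkShellLoggingFilter) {
            head = head.getNext();
        }
        // Unlink shell filters anywhere further down the chain.
        Filter prev = head;
        while (prev != null && prev.getNext() != null) {
            if (prev.getNext() instanceof SparkShellLoggingFilter) {
                prev.setNext(prev.getNext().getNext());
            } else {
                prev = prev.getNext();
            }
        }
    }

    int filterCount() {
        int n = 0;
        for (Filter f = head; f != null; f = f.getNext()) n++;
        return n;
    }
}
```

With this shape, a chain like `[SparkShellLoggingFilter, otherFilter]` keeps `otherFilter` after removal, whereas matching only on `getFirstFilter()` and then calling `clearFilters()` would drop both.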

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.