[
https://issues.apache.org/jira/browse/SPARK-14703?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15245870#comment-15245870
]
Sean Owen commented on SPARK-14703:
-----------------------------------
I'm not sure this is possible in general: SLF4J provides no way to configure
loggers, and Spark needs to configure loggers, so Spark manipulates log4j
directly. If you switch backends, logging still works, but Spark's log4j
configuration calls no longer do anything.
I don't think your patch actually removes the use of log4j? We don't use
attached patches anyway; see the contribution guide:
https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark
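To illustrate the point: before touching log4j APIs, code can first check
which backend SLF4J is actually bound to. This is a hedged sketch, not
Spark's actual code; the class name `LoggingBackendCheck` is made up for
illustration, and it uses reflection against the SLF4J 1.x
`StaticLoggerBinder` so it compiles even when slf4j is absent from the
classpath.

```java
// Sketch (not Spark's actual code): detect whether log4j is the active
// SLF4J 1.x backend before calling log4j configuration APIs directly.
public class LoggingBackendCheck {
    // SLF4J 1.x exposes the bound backend through StaticLoggerBinder.
    private static final String BINDER = "org.slf4j.impl.StaticLoggerBinder";
    // The log4j 1.2 binding reports this logger factory class name.
    private static final String LOG4J_FACTORY = "org.slf4j.impl.Log4jLoggerFactory";

    public static boolean isLog4jBackend() {
        try {
            Class<?> binder = Class.forName(BINDER);
            Object singleton = binder.getMethod("getSingleton").invoke(null);
            Object factoryName =
                binder.getMethod("getLoggerFactoryClassName").invoke(singleton);
            return LOG4J_FACTORY.equals(factoryName);
        } catch (ReflectiveOperationException e) {
            // slf4j (or its log4j binding) is not on the classpath,
            // so log4j is definitely not the backend.
            return false;
        }
    }

    public static void main(String[] args) {
        // Only touch log4j configuration when log4j really is the backend;
        // under logback, the log4j calls would be no-ops, as described above.
        System.out.println("log4j is SLF4J backend: " + isLog4jBackend());
    }
}
```

With a guard like this, the direct log4j manipulation would simply be
skipped under logback instead of silently doing nothing.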
> Spark uses SLF4J, but actually relies quite heavily on Log4J
> ------------------------------------------------------------
>
> Key: SPARK-14703
> URL: https://issues.apache.org/jira/browse/SPARK-14703
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core, YARN
> Affects Versions: 1.6.0
> Environment: 1.6.0-cdh5.7.0, logback 1.1.3, yarn
> Reporter: Matthew Byng-Maddick
> Priority: Minor
> Labels: log4j, logback, logging, slf4j
> Attachments: spark-logback.patch
>
>
> We've built a version of Hadoop CDH-5.7.0 in house with logback as the SLF4J
> provider, in order to send Hadoop logs straight to logstash (to handle with
> logstash/elasticsearch), on top of our existing use of the logback backend.
> While trying to start spark-shell I discovered several points where the fact
> that we weren't using real log4j caused the SparkContext (sc) not to be
> created, or the YARN module not to be found. There are many more places where
> the logging should probably be wrapped more sensibly, but I have a basic
> patch that fixes some of the worst offenders (at least the ones that stop the
> SparkContext being created properly).
> I'm prepared to accept that this is not a good solution and that there
> probably needs to be some sort of better wrapper, perhaps in the
> Logging.scala class, which handles this properly.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]