Matthew Byng-Maddick created SPARK-14703:
--------------------------------------------

             Summary: Spark uses SLF4J, but actually relies quite heavily on Log4J
                 Key: SPARK-14703
                 URL: https://issues.apache.org/jira/browse/SPARK-14703
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core, YARN
    Affects Versions: 1.6.0
         Environment: 1.6.0-cdh5.7.0, logback 1.1.3, yarn
            Reporter: Matthew Byng-Maddick
            Priority: Minor


We've built a version of Hadoop CDH-5.7.0 in house with logback as the SLF4J 
provider, in order to send Hadoop logs straight to Logstash (for processing with 
Logstash/Elasticsearch), on top of our existing use of the logback backend.

While trying to start spark-shell, I discovered several points where the absence 
of a real Log4j caused the SparkContext not to be created, or the YARN module 
not to load. There are many more places where the logging should probably be 
wrapped more sensibly, but I have a basic patch that fixes some of the worst 
offenders (at least the ones that prevent the SparkContext from being created 
properly).

I'm prepared to accept that this is not a good solution, and that there probably 
needs to be some sort of better wrapper, perhaps in the Logging.scala class, 
which handles this properly.
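As a rough illustration of the kind of wrapper I mean (this is a hypothetical sketch, not the patch itself): Log4j-specific calls could be guarded by a classpath check, so that environments using a different SLF4J backend (such as logback) simply skip the Log4j-only setup instead of failing. The class and method names below are made up for the example.

```java
// Hypothetical sketch: only touch org.apache.log4j APIs when Log4j 1.x
// is actually on the classpath; otherwise skip, so logback-backed
// deployments don't break.
public class Log4jGuard {

    /** True if the named class can be loaded, i.e. its jar is on the classpath. */
    static boolean classPresent(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        if (classPresent("org.apache.log4j.LogManager")) {
            // Safe to perform Log4j-specific configuration here
            // (e.g. via reflection, to avoid a hard compile-time dependency).
            System.out.println("Log4j 1.x found; Log4j-specific setup is safe");
        } else {
            System.out.println("No Log4j 1.x; skipping Log4j-specific setup");
        }
    }
}
```

Something along these lines in Logging.scala would let the rest of the codebase stay backend-agnostic, with the Log4j-only behaviour (level tweaking, appender setup) behind the guard.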



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
