Hey Wenlei,

There's an issue in master that is suppressing the log output - I'm trying to debug it before we release 0.8.1. Can you explain exactly how you are running Spark? Are you running the shell or a standalone application?
- Patrick

On Thu, Nov 28, 2013 at 12:54 AM, Wenlei Xie <[email protected]> wrote:
> Hi,
>
> I remember Spark used to print a detailed log to stderr (e.g.
> constructing RDDs, evaluating them, how much memory each partition
> consumes), but I cannot find it anymore; I only see the following:
>
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in
> [jar:file:/u/ytian/wenlei/dynamicGraph/graphx/examples/target/scala-2.9.3/spark-examples-assembly-0.9.0-incubating-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in
> [jar:file:/u/ytian/wenlei/dynamicGraph/graphx/assembly/target/scala-2.9.3/spark-assembly-0.9.0-incubating-SNAPSHOT-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> log4j:WARN No appenders could be found for logger
> (akka.event.slf4j.Slf4jEventHandler).
> log4j:WARN Please initialize the log4j system properly.
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
> more info.
>
> What should I do to get the logging back?
>
> Thanks,
>
> Wenlei
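For anyone hitting the log4j "No appenders could be found" warning quoted above: that warning generally means no log4j.properties was found on the classpath, so putting one there usually restores the console output. The following is a minimal generic sketch, not Spark's shipped configuration (Spark's conf/ directory includes a log4j.properties.template that can be copied to conf/log4j.properties instead):

```properties
# Minimal log4j.properties - place on the application classpath.
# Sends all INFO-and-above messages to stderr.
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

The "multiple SLF4J bindings" lines are a separate (usually harmless) warning: both assembly jars above bundle a StaticLoggerBinder, and SLF4J just picks one.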
