It might help to add a startup hack that complains somewhere if SLF4J
is not hooked up right.
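
Something like this minimal sketch could serve as that startup check (the class name `Slf4jCheck` is illustrative; it relies on the fact that SLF4J 1.x locates its backend through the class `org.slf4j.impl.StaticLoggerBinder`, so probing for that class reflectively reveals whether any binding is on the classpath):

```java
// Hypothetical startup hack: complain loudly at launch if SLF4J has no
// binding, instead of silently falling back to the NOP logger.
public class Slf4jCheck {

    // Returns true if the named class can be loaded from the current classpath.
    static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // SLF4J 1.x static bindings all ship this class; no class, no logging.
        if (!isOnClasspath("org.slf4j.impl.StaticLoggerBinder")) {
            System.err.println("WARNING: no SLF4J binding on the classpath; "
                + "all log output will be silently dropped (NOP logger)");
        }
    }
}
```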

On Sat, Aug 13, 2011 at 3:51 PM, Ted Dunning <[email protected]> wrote:
> Also, SLF4J is not a logging package, but a logging facade.  I think that
> you need slf4j *and* a specific logging package.  You have slf4j-log4j, but
> it isn't clear you have log4j.  You may prefer to put the static slf4j
> binding in the class path to avoid the need for log4j.
>
> Logging packages are generally a pain in the butt since they tend to be kind
> of fancy in what they require on the classpath.
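
To make the facade/binding/backend distinction concrete, a Maven dependency set would look roughly like this. The versions are illustrative: 1.6.1 matches the jars mentioned later in this thread, while the log4j version is an assumption.

```xml
<!-- slf4j-api is only the facade; it logs nothing by itself -->
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-api</artifactId>
  <version>1.6.1</version>
</dependency>
<!-- exactly one binding should be on the classpath -->
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-log4j12</artifactId>
  <version>1.6.1</version>
</dependency>
<!-- ...plus the actual backend that binding forwards to -->
<dependency>
  <groupId>log4j</groupId>
  <artifactId>log4j</artifactId>
  <version>1.2.16</version><!-- assumed version -->
</dependency>
```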
>
> On Sat, Aug 13, 2011 at 3:18 PM, Sean Owen <[email protected]> wrote:
>
>> The unhelpful comment is that, well, the problem is exactly what you've
>> already said: there is no SLF4J binding in the classpath where it needs to
>> be. I suspect you're perhaps mixing up classpaths. Which actual Java task
>> isn't printing -- the Hadoop worker? Are you sure the bindings are in its
>> classpath, versus, say, the classpath of the runner program?
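
One way to answer that question is a sketch like the following, which prints the classpath of whichever JVM runs it; invoked from inside the task (e.g. a mapper's setup method), it shows the worker's view rather than the launcher's. The class name and the slf4j marker are illustrative.

```java
import java.io.File;

public class ClasspathDump {

    // Flags entries whose name mentions slf4j, so binding jars stand out.
    static String mark(String entry) {
        return entry.toLowerCase().contains("slf4j")
            ? entry + "   <-- slf4j"
            : entry;
    }

    public static void main(String[] args) {
        // java.class.path is the classpath of *this* JVM, one entry per line.
        String cp = System.getProperty("java.class.path");
        for (String entry : cp.split(File.pathSeparator)) {
            System.out.println(mark(entry));
        }
    }
}
```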
>>
>> On Sat, Aug 13, 2011 at 9:39 PM, Dhruv Kumar <[email protected]> wrote:
>>
>> > I am running into a strange problem which has started happening only
>> > after I did an svn update on Mahout's trunk this morning.
>> >
>> > For testing out MAHOUT-627, I have created some client code using
>> > IntelliJ IDEA. In the jar artifact, I have added Mahout and
>> > Hadoop-0.20.203.0 libraries as dependencies.
>> >
>> > If I run my code using:
>> >
>> > dhruv@tachyon:~/hadoop-0.20.203.0/bin$ ./hadoop -jar
>> > ~/TestBaumWelch/out/artifacts/TestBaumWelch_jar/TestBaumWelch.jar
>> > TestBaumWelch.TestBaumWelch
>> >
>> > I am unable to see any logs on the terminal or on the MapReduce web admin
>> > console (http://localhost:50030/jobtracker.jsp). The admin console does
>> > not even show that any job is running.
>> >
>> > But, I can clearly tell that the program is executing, since I get a
>> > series of map 100%, reduce 100% and other INFO messages one after the
>> > other, and because it produces results in the output directories.
>> >
>> > However, when I run Hadoop's canonical Word Count example, which does
>> > not depend on Mahout, I am able to see logs via the web admin console
>> > and everything goes OK.
>> >
>> >
>> > I suspect this is because of SLF4J on my system: there may be more than
>> > one class binding, or none at all. Here are the warning messages emitted
>> > soon after I run the program:
>> >
>> > SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
>> > SLF4J: Defaulting to no-operation (NOP) logger implementation
>> > SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for
>> > further details.
>> >
>> >
>> > However, slf4j is clearly on the classpath.
>> >
>> > For hadoop, a portion of echo $CLASSPATH reveals:
>> >
>> > :/home/dhruv/hadoop-0.20.203.0/bin/../lib/slf4j-log4j12-1.4.3.jar
>> >
>> > For mahout:
>> >
>> > /home/dhruv/mahout/examples/target/dependency/slf4j-api-1.6.1.jar:/home/dhruv/mahout/examples/target/dependency/slf4j-jcl-1.6.1.jar:/home/dhruv/mahout/examples/target/dependency/slf4j-log4j12-1.6.1.jar:
>> >
>> >
>> > Also, during Mahout's "mvn install", SLF4J is being added, since I can
>> > see the following messages:
>> >
>> > [INFO] org/slf4j/ already added, skipping
>> > [INFO] org/slf4j/impl/ already added, skipping
>> > [INFO] org/slf4j/impl/StaticLoggerBinder.class already added, skipping
>> > [INFO] org/slf4j/impl/StaticMarkerBinder.class already added, skipping
>> > [INFO] org/slf4j/impl/StaticMDCBinder.class already added, skipping
>> >
>> > Has anyone else run into a problem like this before?
>> >
>>
>



-- 
Lance Norskog
[email protected]
