Currently using Nutch 0.7.2.

We have a process that runs CrawlTool from within an application,
perhaps hundreds of times during the course of the day.  The problem I
am seeing is that the log statements from my application (I am using
commons-logging and Log4j) are also being written to the Nutch log.
The real problem, though, is that each log statement gets repeated by
a factor that grows with every call.  At this point, a single debug
statement issued after I call CrawlTool.main() produces 7,500 entries
in the log.  The application also appears to leak memory as this
happens; it eventually exhausts its 1.5GB heap.  Has anyone else seen
this problem?  I have to keep shutting the app down so I can continue.
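
For what it's worth, this is roughly how I plan to confirm that
appenders are piling up on the root logger (only a sketch against the
Log4j 1.x API; the class name is just mine):

import java.util.Enumeration;

import org.apache.log4j.Appender;
import org.apache.log4j.Logger;

// Dump the appenders attached to the Log4j root logger.  If the count
// grows after every CrawlTool.main() call, each statement is written
// once per appender, which would explain the multiplying log entries.
public class AppenderDump {
    public static int countRootAppenders() {
        int count = 0;
        Enumeration appenders = Logger.getRootLogger().getAllAppenders();
        while (appenders.hasMoreElements()) {
            Appender a = (Appender) appenders.nextElement();
            System.out.println("root appender: " + a.getName());
            count++;
        }
        return count;
    }
}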

Any clues?  Does Nutch create Log4j appenders in the crawler code, and
could that be causing the problem?
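
If it does, the workaround I have in mind is to tear the logging
configuration down between runs and re-read my own properties file,
roughly like this (again just a sketch; the path to log4j.properties
is only an example from my setup):

import org.apache.log4j.LogManager;
import org.apache.log4j.PropertyConfigurator;

public class LogReset {
    // After each crawl, drop whatever appenders have accumulated and
    // reload my own Log4j configuration before the next run.
    public static void resetLogging() {
        LogManager.resetConfiguration();
        // Path below is just an example from my setup.
        PropertyConfigurator.configure("conf/log4j.properties");
    }
}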

-- 
"Conscious decisions by conscious minds are what make reality real"
