Hey,

I have changed the user.log_retain size to 10 MB, but it still produces a huge volume of logs. This fills the disk, the DataNode fails, and the job fails with it. If the logs are instead deleted periodically, the fetch phase takes a very long time and it is uncertain whether it will complete at all.
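
For reference, this is roughly what I set; I am assuming the property meant here is the MRv2 per-task userlog cap, mapreduce.task.userlog.limit.kb (older releases called it mapred.userlog.limit.kb):

  <!-- mapred-site.xml: cap each task attempt's userlog at ~10 MB (10240 KB) -->
  <property>
    <name>mapreduce.task.userlog.limit.kb</name>
    <value>10240</value>
  </property>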

Shubham Gupta

On Wednesday 24 August 2016 05:20 PM, Markus Jelsma wrote:
If it is Nutch logging, change its level in conf/log4j.properties. It can also 
be Hadoop logging.
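For example, something along these lines in conf/log4j.properties should cut the
volume; a sketch only, the exact logger names depend on your Nutch build:

  # raise thresholds so only warnings and errors are written
  log4j.rootLogger=WARN,DRFA
  log4j.logger.org.apache.nutch=WARN
  log4j.logger.org.apache.hadoop=WARN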
M.
-----Original message-----
From: shubham.gupta <shubham.gu...@orkash.com>
Sent: Tuesday 23rd August 2016 8:15
To: user@nutch.apache.org
Subject: Application creating huge amount of logs : Nutch 2.3.1 + Hadoop 2.7.1

Hey

I have integrated Nutch 2.3.1 with Hadoop 2.7.1; the fetcher.parse
property is set to TRUE and the database used is MongoDB. While the Nutch
map job runs, it creates node logs over 13 GB in size, and the cause of
such a huge amount of log data is unknown. Any suggestion would help.
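
For completeness, the relevant part of my nutch-site.xml looks roughly like this
(a sketch, written from memory):

  <property>
    <name>fetcher.parse</name>
    <value>true</value>
    <description>Parse content during fetch instead of in a separate parse job.</description>
  </property>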

Thanks in advance.

Shubham Gupta

