Hey Hadi,

I have seen this error message several times, for different reasons, but
never because of disk space.

I would suggest you run smaller crawls to narrow down the issue: start
with a top 1 crawl, then top 10, and so on.

Remi

On Sunday, February 19, 2012, Lewis John Mcgibbney <
[email protected]> wrote:
> Can you please paste how you have specified your Hadoop temp dir? That
> seems to be the cause of such stack trace errors.
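For reference, hadoop.tmp.dir is normally set in Hadoop's conf/core-site.xml (a local Nutch install reads the equivalent Hadoop property files on its classpath); the path below is only a placeholder:

```xml
<!-- conf/core-site.xml: where Hadoop stages temporary job data.
     /path/to/hadoop-tmp is a placeholder; point it at a partition with
     plenty of free space that the user running the crawl can write to. -->
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/path/to/hadoop-tmp</value>
  </property>
</configuration>
```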
>
> Thanks
>
> On Sun, Feb 19, 2012 at 7:04 AM, hadi <[email protected]> wrote:
>
>> Yes, there is a hadoop log:
>>
>>
>>
>> I searched for this error, but everyone says it is caused by low disk
>> space, and I have specified a large temp directory.
>>
>> --
>> View this message in context:
>>
http://lucene.472066.n3.nabble.com/IOExeption-when-crawling-with-nutch-in-Fetching-process-tp3756272p3757564.html
>> Sent from the Nutch - User mailing list archive at Nabble.com.
>>
>
>
>
> --
> *Lewis*
>
