Brian Whitman wrote:
> On Jan 15, 2007, at 1:36 PM, Andrzej Bialecki wrote:
>> Brian Whitman wrote:
>>> (nutch-nightly, hadoop 0.9.1)
>>>
>>> The file indicated (bad_files/data.-931801681) is a 255MB binary 
>>> file -- running strings on it shows a lot of URIs. There's also a 
>>> 2MB .data.crc-931801681 file, all binary.
>>>
>>> Any idea how this happened or how to avoid?
>>
>> Are you running this on an NFS volume, using LocalFileSystem? You 
>> aren't running out of disk space by any chance?
>
>
> No, it's a RAID-5 local disk
>
> /dev/md0              1.5T  193G  1.2T  14% /array

OK, then I don't know what's causing it. There are really only two 
possibilities: either the file is genuinely corrupted, or there is a subtle 
bug somewhere in Hadoop. Since this is a local FS, you can try removing the 
.crc file and see if that helps (Hadoop should rebuild it when it's needed).
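
If it's easier to script than to hunt the file down by hand, here is a 
minimal sketch in plain Java. It assumes the usual "." + <filename> + ".crc" 
sibling naming that Hadoop's LocalFileSystem uses for its checksum files; 
the path below is only an example, so adjust it (and the .crc name, which in 
your case appears to be the .data.crc-931801681 file you mentioned) to 
whatever you actually have on disk:

import java.io.File;
import java.io.IOException;

/**
 * Sketch: delete the hidden ".<name>.crc" checksum file that sits next to a
 * suspected-corrupt local data file, so checksum verification stops failing
 * and Hadoop can regenerate the checksum the next time the file is written.
 */
public class DropCrcFile {
    public static void main(String[] args) throws IOException {
        // Data file reported in the error (example path, pass your own as arg).
        File data = new File(args.length > 0 ? args[0]
                : "/array/crawl/bad_files/data.-931801681");

        // LocalFileSystem keeps checksums in a hidden sibling file named
        // "." + <filename> + ".crc" in the same directory.
        File crc = new File(data.getParentFile(),
                "." + data.getName() + ".crc");

        if (crc.exists()) {
            System.out.println("Removing checksum file: " + crc);
            if (!crc.delete()) {
                throw new IOException("Could not delete " + crc);
            }
        } else {
            System.out.println("No checksum file found at " + crc);
        }
    }
}

Make sure nothing (e.g. a running fetch or index job) has the segment open 
while you do this, and keep a copy of the .crc file somewhere in case you 
want to compare later.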

-- 
Best regards,
Andrzej Bialecki     <><
 ___. ___ ___ ___ _ _   __________________________________
[__ || __|__/|__||\/|  Information Retrieval, Semantic Web
___|||__||  \|  ||  |  Embedded Unix, System Integration
http://www.sigram.com  Contact: info at sigram dot com


