I should also have mentioned that I have tried this with compression off, and I haven't made any real changes to the config file ... running 3.2.0b4 on Red Hat Linux 8.0.
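(For reference, here is roughly what "compression off" looks like in my config -- a minimal sketch, not my actual file: the paths are placeholders, and I'm assuming compression is governed by the `compression_level` attribute in the 3.2 betas.)

```
# Minimal htdig.conf sketch (paths are placeholders)
database_dir:      /var/lib/htdig/db
start_url:         http://localhost/
compression_level: 0
```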
Thanks! Abbie

____________________

Lately I've been getting a lot of segmentation faults when I try to run htdig against a certain set of files, all .txt or .html. The files are rather large; they are 10-K filings. However, I never get any indication of why it fails, and I've run htdig on larger sets of these files without trouble. I'm starting to wonder if there is something I should recompile it with for testing? I'm a newbie, so any help is truly appreciated!

If I do recompile it, do I just remove all instances of htdig on my machine? I'd like to save the old databases so they stay searchable. The sets that htdig runs on usually take about two days, and I have to update these databases every week with new data. We haven't started this cycle yet, but I'd like to smooth out these segmentation faults before we start anything. Any help is truly appreciated!

Thanks! Abbie

_______________________________________________
htdig-general mailing list <[EMAIL PROTECTED]>
To unsubscribe, send a message to <[EMAIL PROTECTED]> with a subject of unsubscribe
FAQ: http://htdig.sourceforge.net/FAQ.html
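(A sketch of what I'm planning for the "save the old databases, then rebuild for debugging" part -- the paths here are assumptions, not my real install; `DB_DIR` should be whatever `database_dir` is set to in htdig.conf, and the rebuild steps assume building from the source tarball.)

```shell
#!/bin/sh
# Keep the old databases searchable: copy them aside before any
# reindex or reinstall. DB_DIR is a placeholder -- use the
# database_dir value from your htdig.conf.
DB_DIR="${DB_DIR:-/var/lib/htdig/db}"
BACKUP="${DB_DIR}.bak-$(date +%Y%m%d)"

if [ -d "$DB_DIR" ]; then
  cp -a "$DB_DIR" "$BACKUP" && echo "old databases saved to $BACKUP"
else
  echo "no database directory found at $DB_DIR" >&2
fi

# To chase the segfault, there's no need to remove the old install:
# rebuild with debug symbols under a separate --prefix, enable core
# dumps, reproduce the crash, and get a backtrace (all hypothetical
# paths):
#   ./configure --prefix=/usr/local/htdig-debug CXXFLAGS="-g -O0"
#   make && make install
#   ulimit -c unlimited
#   /usr/local/htdig-debug/bin/htdig -i -c /path/to/htdig.conf
#   gdb /usr/local/htdig-debug/bin/htdig core   # then type 'bt'
```

The backtrace from gdb would at least show which file and function htdig is in when it dies, which should help whoever looks at this pin it down.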

