..and I need help, please. I've made it this far on what I could find in the mailing-list archives, but I've run into my old friend, the 2 GB limit, on a 64-bit Compaq 6400R running SuSE 9.1. It doesn't seem to want to leave me alone. I moved up from a 32-bit box to ditch it, but 3.1.6 still hits that limit when running htdig. I really thought it would get the message when I dumped Red Hat for SuSE.
I found on the lists that Berthold Cogel wrote, "Perhaps using -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 during build solves our problem?" He ended up with a 5.5 GB file after that trick. I'm glad he was able to solve his problem, but I have no idea what to do with this, so maybe you can help me. All I need is an example of where and how to apply those flags, kept simple and easy to understand. I can make things run, but I don't know how to build an engine. It seems to have worked for a few people, and I'd like to be one of them.

I've got 140,000 files to crawl and serve. I know Geoff Hutchison says to use 3.2 and the Berkeley DB, but I'm familiar and comfortable with 3.1.6 and would have separation anxiety, I think. Please help me get this 2 GB monkey off my back.

Thanks,
James Garrett
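
P.S. In case it helps whoever answers: below is a tiny C program I put together to convince myself what those two defines actually do on a glibc/Linux box. The file name, the gcc lines, and the ./configure guess in the comments are all just my own assumptions, not anything from the htdig docs or sources; that guess is exactly the part I'd love someone to confirm or correct.

    /*
     * large_file_check.c -- a tiny test of what the two defines do
     * (the file name is just mine; this is not from the htdig sources).
     *
     * On a 32-bit glibc build without the defines, off_t is 32 bits,
     * which is where the 2 GB ceiling comes from.  Building with
     * -D_FILE_OFFSET_BITS=64 makes off_t 64 bits; -D_LARGEFILE_SOURCE
     * additionally exposes fseeko()/ftello().
     *
     * Compile it both ways and compare:
     *
     *   gcc large_file_check.c -o plain && ./plain
     *   gcc -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 \
     *       large_file_check.c -o lfs && ./lfs
     *
     * My guess (please correct me!) is that the flags get passed to
     * the htdig 3.1.6 build the usual autoconf way, something like:
     *
     *   CFLAGS="-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64" \
     *   CXXFLAGS="-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64" \
     *   ./configure ...     (then make; make install)
     */
    #include <stdio.h>
    #include <sys/types.h>

    int main(void)
    {
        /* 8 bytes means 64-bit file offsets, i.e. no 2 GB limit. */
        printf("sizeof(off_t) = %lu bytes\n",
               (unsigned long) sizeof(off_t));
        return 0;
    }

If the plain build prints 4 and the large-file build prints 8, I assume that's the difference between hitting and not hitting the 2 GB wall, and that getting those same defines into the htdig build is what Berthold was describing.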