--On Tuesday, 27 August 2002 09:01 -0500 Searcher <[EMAIL PROTECTED]> 
wrote:

> I posted a message about a week ago asking if anyone knew why a URL
> list of 10,028 URLs would prevent htdig from running. Yet, when I
> cut the list back to 5000 URLs at a time, it seems to work fine.
>
> I've heard that some folks have 1.5 million URLs in their lists.
> Does anyone have any thoughts on this?
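
For what it's worth, here is a minimal sketch of that "5000 at a time"
workaround, in Python; the input file name, output naming and chunk size
below are just placeholders, not anything from your actual setup:

    # Illustrative only: split a flat URL list into chunks of 5000
    # so each chunk can be fed to htdig separately.
    def split_url_list(path="urls.txt", chunk_size=5000):
        with open(path) as f:
            urls = [line.strip() for line in f if line.strip()]
        for n, start in enumerate(range(0, len(urls), chunk_size)):
            with open("urls_%02d.txt" % n, "w") as out:
                out.write("\n".join(urls[start:start + chunk_size]) + "\n")

    if __name__ == "__main__":
        split_url_list()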

Might depend on the type of URL list.
I guess it makes a difference whether you have a flat list
with many thousands of URLs in one file or whether the links
are hierarchically organized. At least that was my impression
a while ago. htdig (or, IIRC, actually "sort") always died
when fed the (rewritten) output of a "find ... -name *.html"
on a 4.3 BSD box. Roughly 5 GB in ~130,000 files.
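
If it helps, below is a rough, purely illustrative Python sketch of what
I mean by keeping things hierarchically organized: rewrite the "find"
output into URLs and write one small start-URL file per directory instead
of one huge flat list. The docroot, hostname and script name are invented
placeholders, not taken from that old setup:

    # Illustrative only: turn "find $DOCROOT -name '*.html'" output
    # (read from stdin) into URLs, grouped into one file per directory.
    import os
    import sys
    from collections import defaultdict

    DOCROOT = "/usr/local/www/data"      # placeholder
    BASE = "http://www.example.com"      # placeholder

    groups = defaultdict(list)
    for line in sys.stdin:
        path = line.strip()
        if not path.endswith(".html"):
            continue
        rel = os.path.relpath(path, DOCROOT)
        groups[os.path.dirname(rel)].append(BASE + "/" + rel)

    # One URL file per directory; top-level files go into "urls_root.txt".
    for dirname, urls in sorted(groups.items()):
        name = "urls_%s.txt" % (dirname.replace("/", "_") or "root")
        with open(name, "w") as out:
            out.write("\n".join(urls) + "\n")

Usage would be something like
"find /usr/local/www/data -name '*.html' | python rewrite_urls.py"
(rewrite_urls.py being a made-up name for the script above).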

Cheers, Marcel


