I have a theory question now: are there any limitations on the size or number of records that HtDig can handle? Is there an OS limit in FreeBSD, for example?

Thanks,
Paul

Geoff Hutchison wrote:
>
> On Tue, 21 Dec 1999, Sean Pecor wrote:
>
> > <grin>. I had a fairly large development project that required a search
> > engine capable of handling approximately 150,000 internal pages and 10,000+
> > pages on thousands of external web servers. Despite being warned by the FAQ
> > that htdig wasn't built for this task, I couldn't resist giving it a go;
>
> Out of curiosity, where does it say this? I see:
>
> 1.2. Can I index the internet with ht://Dig?
>
> No, as above, ht://Dig is not meant as an internet-wide search engine.
> While there is theoretically nothing to stop you from indexing as much as
> you wish, practical considerations (e.g. time, disk space, memory, etc.)
> will limit this.
>
> (BTW, we have lots of people indexing well above 150,000 URLs.)
>
> Cheers,
> -Geoff Hutchison
> Williams Students Online
> http://wso.williams.edu/
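(An aside not taken from the original thread: on the FreeBSD side of the question, one OS ceiling that could matter is the per-process file size limit, since it would cap the Berkeley DB files htdig writes; whether that is the binding limit in practice is an assumption. Below is a minimal C sketch that just reports that limit via the standard POSIX getrlimit() call.)

/*
 * Sketch only: report the per-process file size limit (RLIMIT_FSIZE)
 * on FreeBSD. The connection to htdig's database files is an
 * assumption for illustration, not something stated in the thread.
 */
#include <stdio.h>
#include <sys/types.h>
#include <sys/time.h>
#include <sys/resource.h>

int main(void)
{
    struct rlimit rl;

    /* Query the soft limit on the size of files this process may create. */
    if (getrlimit(RLIMIT_FSIZE, &rl) != 0) {
        perror("getrlimit");
        return 1;
    }

    if (rl.rlim_cur == RLIM_INFINITY)
        printf("per-process file size limit: unlimited\n");
    else
        printf("per-process file size limit: %llu bytes\n",
               (unsigned long long)rl.rlim_cur);

    return 0;
}

Compiling it with "cc -o fsizelimit fsizelimit.c" and running it gives the same information as "ulimit -f" in the shell (which reports 512-byte blocks) or the FreeBSD limits(1) utility, so the program is only needed if you want to check this from code.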
Re: [htdig] One solution for slow dig on Linux.
Premier Hosting Administrator Tue, 21 Dec 1999 09:27:23 -0800
