I have recently started to use Lucy (with Perl) and everything went well until I tried to index a large file store (>300,000 files). The indexer process grew to more than 8 GB and the machine ran out of resources. My questions are:

a) Is this level of resource usage normal?

b) Is there a way to avoid swamping the machine, for example by indexing in batches as sketched below?
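To make question b) concrete, here is a rough, untested sketch of the batching I have in mind: open a fresh Indexer per batch and commit() after each one so buffered documents are flushed to disk between batches. The schema, paths, field names and batch size are placeholders, not my real code.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Lucy::Plan::Schema;
    use Lucy::Plan::StringType;
    use Lucy::Plan::FullTextType;
    use Lucy::Analysis::EasyAnalyzer;
    use Lucy::Index::Indexer;

    my $index_path = '/path/to/index';   # placeholder
    my $batch_size = 1000;               # arbitrary; tune to available memory

    # Minimal two-field schema: the real schema will differ.
    my $analyzer = Lucy::Analysis::EasyAnalyzer->new(language => 'en');
    my $schema   = Lucy::Plan::Schema->new;
    $schema->spec_field(name => 'path',    type => Lucy::Plan::StringType->new);
    $schema->spec_field(
        name => 'content',
        type => Lucy::Plan::FullTextType->new(analyzer => $analyzer),
    );

    my @files = @ARGV;   # files to index, passed on the command line
    while (my @batch = splice(@files, 0, $batch_size)) {
        # Fresh Indexer per batch; commit() writes the segment out, so
        # memory held for buffered documents is released between batches.
        my $indexer = Lucy::Index::Indexer->new(
            index  => $index_path,
            schema => $schema,
            create => 1,
        );
        for my $file (@batch) {
            open my $fh, '<', $file or next;
            my $content = do { local $/; <$fh> };
            close $fh;
            $indexer->add_doc({ path => $file, content => $content });
        }
        $indexer->commit;
    }

Is this the right way to keep the indexer's memory bounded, or is there a better-supported mechanism?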

I also found that the searcher becomes very large for large indexes, and since ours runs as part of a FastCGI process it exceeded the process's ulimit. Raising the ulimit fixed this, but diagnosing the issue was difficult because the query simply returned 0 results rather than indicating that it had run out of process space.
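For reference, one way to raise the limit from inside the Perl process before the searcher is built would look roughly like this; BSD::Resource is a separate CPAN module (not part of Lucy), and the path and 4 GB figure are placeholders:

    use strict;
    use warnings;
    use BSD::Resource qw(getrlimit setrlimit RLIMIT_AS);
    use Lucy::Search::IndexSearcher;

    # Raise the soft address-space limit (up to the hard limit) before the
    # searcher opens the index; the target size is arbitrary here.
    my ($soft, $hard) = getrlimit(RLIMIT_AS);
    my $wanted = 4 * 1024 * 1024 * 1024;   # 4 GB, adjust to the index size
    setrlimit(RLIMIT_AS, $wanted, $hard)
        or warn "could not raise RLIMIT_AS (soft=$soft, hard=$hard)";

    my $searcher = Lucy::Search::IndexSearcher->new(index => '/path/to/index');

It would still be nicer if the searcher reported the failure rather than silently returning 0 hits.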

Many thanks

Edwin Crockford
