Hi, I'm experimenting with using Lucy to index the data I normally store in relational (MySQL) databases. I'm just taking the text out of each db and putting it into Lucy stores. Each database gets its own index directory, so it's easy for me to update just part of my search index when a db changes. So far I've processed about 30 dbs into a total of about 2.3GB of Lucy indexes.

The problem is that my machine (a pretty decent dual-core Linux host) keeps running out of memory, especially when indexing a large db with 100K+ records. My sysadmin keeps killing the process because it threatens to take down the machine.
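For reference, here's roughly the shape of my indexing loop, boiled down to a minimal sketch (the paths, table, field names, and query below are stand-ins, not my real ones):

    use strict;
    use warnings;
    use DBI;
    use Lucy::Plan::Schema;
    use Lucy::Plan::FullTextType;
    use Lucy::Analysis::EasyAnalyzer;
    use Lucy::Index::Indexer;

    # One index directory per database, so a single db can be re-indexed on its own.
    my $db_name   = 'example_db';            # stand-in name
    my $index_dir = "/var/lucy/$db_name";    # stand-in path

    my $schema   = Lucy::Plan::Schema->new;
    my $analyzer = Lucy::Analysis::EasyAnalyzer->new( language => 'en' );
    my $type     = Lucy::Plan::FullTextType->new( analyzer => $analyzer );
    $schema->spec_field( name => 'content', type => $type );

    my $indexer = Lucy::Index::Indexer->new(
        index    => $index_dir,
        schema   => $schema,
        create   => 1,
        truncate => 1,    # rebuild this db's index from scratch
    );

    my $dbh = DBI->connect( "dbi:mysql:$db_name", 'user', 'pass' );
    my $sth = $dbh->prepare('SELECT body FROM records');    # stand-in query
    $sth->execute;

    # Add every row's text to the index, then commit once at the end.
    while ( my ($body) = $sth->fetchrow_array ) {
        $indexer->add_doc( { content => $body } );
    }
    $indexer->commit;

The runs that get killed look like this one, just against tables with 100K+ rows.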
I'm using the latest Perl and Lucy source. Any ideas?

Ken
