You can segment your index into n physical parts (perhaps 4), then index those n parts concurrently. At query time you span the parts with some kind of multi-searcher. The one thing to watch out for: if you are going to do a recrawl / update of documents against the existing index, you will need a reproducible way of hashing your documents over the partitions (assuming you are deleting the previous copy of each document), so that an updated document lands in the same sub-index as the original.
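As a minimal sketch of the reproducible-hashing point (names here are illustrative, not a Lucene API): route each document to a partition by hashing a stable key such as its URL, so a recrawl deterministically finds the sub-index holding the old copy.

```java
// Sketch: deterministic routing of a document to one of N index partitions.
// Assumes each document has a stable unique key (e.g. its URL); the class
// and method names are hypothetical, not part of Lucene.
public class PartitionRouter {
    private final int numPartitions;

    public PartitionRouter(int numPartitions) {
        this.numPartitions = numPartitions;
    }

    // The same key always maps to the same partition, so a recrawl can
    // delete the old copy from the correct sub-index before re-adding.
    // floorMod keeps the result non-negative even if hashCode() is negative.
    public int partitionFor(String docKey) {
        return Math.floorMod(docKey.hashCode(), numPartitions);
    }

    public static void main(String[] args) {
        PartitionRouter router = new PartitionRouter(4);
        int p1 = router.partitionFor("http://example.com/page1");
        int p2 = router.partitionFor("http://example.com/page1");
        System.out.println(p1 == p2); // routing is reproducible
    }
}
```

Note that String.hashCode() is stable across JVM runs, which is what makes this reproducible; if you change the number of partitions you would need to re-route (and re-index) everything.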
Hope that helps,

Chris

--- Kevin Burton <[EMAIL PROTECTED]> wrote:

> Is it possible to get Lucene to do an index optimize on multiple
> processors?
>
> It's a single-threaded algorithm currently, right?
>
> It's a shame since I have a quad machine but I'm only using 1/4th of the
> capacity. That's a heck of a performance hit.
>
> Kevin

---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]