That depends (tm Erik Hatcher) <G>. The problem with such an open-ended question is that there are so many unique variables that it's impossible to answer in any meaningful way. For instance....
How much work is it to parse the log files? What kind of hardware are you using? Are you accessing things over a network? Is there network latency involved? Can you index various parts in parallel? How many documents are we talking about? You've given no data on whether you expect 100 documents or 1,000,000,000,000,000 documents. How fast is the data being added to your syslogs? And on and on and on.

All you can do is set up a test and see. It shouldn't be very hard to, say, create a small program that randomizes input, pushes it through the indexing process, and measures the result.

Best
Erick

On 1/29/07, Saravana <[EMAIL PROTECTED]> wrote:
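The kind of throwaway benchmark described above might look like the following sketch. It is an assumption-laden illustration, not a definitive harness: the class name `IndexBench`, the field sizes, and the stand-in `Consumer` are all invented here, and the real measurement would plug `IndexWriter.addDocument` calls into the consumer.

```java
import java.util.Random;
import java.util.function.Consumer;

public class IndexBench {
    // Generate one synthetic syslog-like record: 14 fields of random
    // lowercase text, roughly 200 bytes in total (~14 chars per field).
    static String[] randomDoc(Random rnd) {
        String[] fields = new String[14];
        for (int i = 0; i < 14; i++) {
            StringBuilder sb = new StringBuilder();
            for (int j = 0; j < 14; j++) {
                sb.append((char) ('a' + rnd.nextInt(26)));
            }
            fields[i] = sb.toString();
        }
        return fields;
    }

    // Push `count` random docs through `indexer` and return docs/second.
    // A fixed seed makes repeated runs comparable.
    static double measure(int count, Consumer<String[]> indexer) {
        Random rnd = new Random(42);
        long start = System.nanoTime();
        for (int i = 0; i < count; i++) {
            indexer.accept(randomDoc(rnd));
        }
        double seconds = (System.nanoTime() - start) / 1e9;
        return count / seconds;
    }

    public static void main(String[] args) {
        // Stand-in indexer: in a real test, build a Lucene Document from
        // the fields here and call writer.addDocument(doc) instead.
        double rate = measure(100_000, doc -> { /* writer.addDocument(...) */ });
        System.out.printf("%.0f docs/sec (no-op indexer)%n", rate);
    }
}
```

With a no-op consumer this measures only document generation, which gives you an upper bound; swapping in real indexing calls shows how much of the budget Lucene itself consumes.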
Hi,

Has anybody used Lucene to index syslogs? What is the maximum indexing rate we can get when storing a 200-byte document with 14 fields?

thanks,
MSK
--
Every day brings us a sea of opportunities