On Feb 25, 2004, at 7:58 PM, sam xia wrote:

I'd recommend a pool of filters, one per category. Regenerate them
when the index changes; otherwise leave the instances alive and reuse
them for queries - this will speed things up pretty dramatically, I'd
guess. There is a QueryFilter you could use, or you could write a
custom one that might be faster.

Erik
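The pool Erik describes might look something like this minimal sketch. A plain java.util.BitSet stands in for Lucene's QueryFilter so the code compiles without the Lucene jar, and the long "index version" argument is an assumption about how you detect that the index has changed:

```java
import java.util.BitSet;
import java.util.HashMap;
import java.util.Map;

// Hypothetical per-category filter pool. Real code would cache Lucene
// QueryFilter instances; a plain BitSet stands in here so the sketch
// runs without Lucene on the classpath.
public class CategoryFilterPool {
    private final Map<String, BitSet> pool = new HashMap<String, BitSet>();
    private long indexVersion = Long.MIN_VALUE;

    public BitSet filterFor(String category, long currentIndexVersion) {
        if (currentIndexVersion != indexVersion) {
            pool.clear();                 // index changed: regenerate lazily
            indexVersion = currentIndexVersion;
        }
        BitSet bits = pool.get(category);
        if (bits == null) {
            bits = computeBits(category); // expensive step, done once per category
            pool.put(category, bits);
        }
        return bits;
    }

    // Stand-in for running the category query against the index and
    // collecting the matching document ids into a BitSet.
    private BitSet computeBits(String category) {
        BitSet bits = new BitSet();
        bits.set(category.length());      // dummy content for the sketch
        return bits;
    }
}
```

The point of the version check is that a cache hit returns the same live instance for as long as the index is unchanged, so the query cost is paid once per category per index generation.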

Thanks for your quick response, Erik. BTW, I checked
your web site; the logo is cool.

haha.... I'll take that as a hint that I should put more there than a "splash" page :)


Is there a way to store the filter cache to the hard
drive? Then I could just read it back from disk. Since
I have lots of categories, it might be impossible to
cache every query filter's BitSet in memory.

BitSets are tiny - so you'd have to have a *lot* of categories to fill up a decent-sized RAM.
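The arithmetic behind "tiny": a filter's BitSet holds one bit per document in the index, so its raw size is roughly numDocs / 8 bytes, object overhead aside. A rough sketch of that estimate:

```java
public class BitSetSizeMath {
    // Approximate raw size of a one-bit-per-document filter BitSet,
    // ignoring JVM object overhead: eight documents per byte.
    public static long approxFilterBytes(long numDocs) {
        return numDocs / 8;
    }

    public static void main(String[] args) {
        // A 1,000,000-document index costs about 125,000 bytes
        // (roughly 122 KB) per cached filter.
        System.out.println(approxFilterBytes(1000000));
    }
}
```

At that rate, 10,000 cached categories over a million-document index would run to roughly 1.2 GB of bits, which is presumably why persisting some of them to disk is attractive here.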


But Filter is Serializable, so you could (in theory) persist them somewhere.
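Since java.util.BitSet is itself Serializable, the round trip to disk is straightforward with plain object streams. A minimal sketch (the class and method names are made up; a real Filter subclass could be written and read the same way):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.BitSet;

public class FilterPersistence {
    // Write the filter's bits to disk with standard Java serialization.
    public static void save(BitSet bits, File file) throws IOException {
        ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(file));
        try {
            out.writeObject(bits);
        } finally {
            out.close();
        }
    }

    // Read the bits back; the caller can wrap them in a Filter again.
    public static BitSet load(File file) throws IOException, ClassNotFoundException {
        ObjectInputStream in = new ObjectInputStream(new FileInputStream(file));
        try {
            return (BitSet) in.readObject();
        } finally {
            in.close();
        }
    }
}
```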

How about keeping each category in its own segment?

I'm not sure how you'd ensure that. Never optimize?


You could have a separate index per category - but I think you already mentioned that, and it would use up file handles.

Then I would
have 10000 segments to work with. Is that going to be
faster than the filter solution?

Filters are fast.


Erik




