Paul, thanx for your suggestions.  It seems like they mostly address the
issue of improving search time, by eliminating the need to read the norm
files from disk -- but the speed of the query isn't as big of a concern
for us as the memory footprint.

As I understand it, the point where we are really pushing the limits of
available memory comes during optimize -- particularly after user behavior
has resulted in deleting/re-adding a large percentage of the documents in
the index.

: For really large indexes the norms might become a bottleneck for
: when building them, but iirc this was improved recently.

Our production system runs 1.4.3 .... perhaps we should try some stress
tests against 1.9 and see what happens.


-Hoss

