Sleepycat Software writes:
> One other comment, to state what's probably obvious -- the Huffman
> encoding helps in two different ways: it not only reduces your disk space
> requirements, but it also increases your page-fill-factor by increasing
> the density of your data on the page. The compression-during-I/O scheme
> only helps with disk space; it does nothing to increase the density of
> information on the page. It will, however, help more with disk space than
> the Huffman scheme will.
I'm not sure I understand. Why is the page entry density higher if
you compress individual entries? Assuming I have a 4k page with 1000
compressed entries, why would it be denser than an 8k page in memory,
stored as a 4k page on disk, containing 1000 entries? I must be missing
something important here.
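
To spell out the arithmetic behind my example (the 8-byte raw and 4-byte
compressed entry sizes below are invented, assuming roughly a 2:1 Huffman
ratio), the only difference I can see is the size of the page in memory:

#include <stdio.h>

/* Invented sizes for illustration: 8-byte raw entries that compress
 * to about 4 bytes each (a 2:1 ratio is assumed throughout). */
#define RAW_ENTRY   8
#define HUFF_ENTRY  4
#define N_ENTRIES   1000

int main(void)
{
    /* Per-entry Huffman coding: the page holds compressed entries
     * both on disk and in the buffer cache. */
    printf("per-entry Huffman: %d bytes on disk, %d bytes in memory\n",
           N_ENTRIES * HUFF_ENTRY, N_ENTRIES * HUFF_ENTRY);

    /* Compression during I/O: the page holds raw entries in memory
     * and is compressed only as it is written out to disk. */
    printf("compress-on-I/O:   %d bytes on disk, %d bytes in memory\n",
           N_ENTRIES * HUFF_ENTRY, N_ENTRIES * RAW_ENTRY);
    return 0;
}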
> It seems to me that any scheme to increase the density on the page is
> going to require per key/data decompression in the comparison function.
Yes. *sigh*.
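
A rough sketch of what I expect that comparison function to look like,
assuming the Berkeley DB 3.x set_bt_compare() callback signature (the DB
handle plus the two DBTs; 2.x passes only the DBTs) and a hypothetical
huffman_decode() helper standing in for the real decoder:

#include <string.h>
#include <db.h>

/* Stand-in for a real Huffman decoder: here it only copies the bytes
 * through so the sketch compiles; a real decoder would expand 'in'. */
static size_t
huffman_decode(const void *in, size_t inlen, char *out, size_t outmax)
{
    size_t n = inlen < outmax ? inlen : outmax;
    memcpy(out, in, n);
    return (n);
}

/* Btree key comparison: every call pays for two decompressions before
 * the actual memcmp, which is the per-key cost being sighed about. */
static int
compressed_key_compare(DB *db, const DBT *a, const DBT *b)
{
    char abuf[256], bbuf[256];   /* assumes decoded keys fit in 256 bytes */
    size_t alen, blen, min;
    int cmp;

    (void)db;                    /* unused in this sketch */

    alen = huffman_decode(a->data, a->size, abuf, sizeof(abuf));
    blen = huffman_decode(b->data, b->size, bbuf, sizeof(bbuf));

    min = alen < blen ? alen : blen;
    cmp = memcmp(abuf, bbuf, min);
    if (cmp != 0)
        return (cmp);
    return (alen < blen ? -1 : (alen > blen ? 1 : 0));
}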
--
Loic Dachary
ECILA
100 av. du Gal Leclerc
93500 Pantin - France
Tel: 33 1 56 96 09 80, Fax: 33 1 56 96 09 61
e-mail: [EMAIL PROTECTED] URL: http://www.senga.org/