Merlin Beedell at Demon wrote:
I hope that this is a common enough request that some results and comments already exist. Just point me in the right direction!

I would like to know how Derby performs (inserts and searches) as the record count grows from 1 to 1,000 million rows or more, particularly when foreign key constraints are involved, and how this compares with other databases.

I fully recognise that such tests are quite subjective and vary greatly depending on configuration settings that do not change the SQL used.

We have a situation where Derby performs well up to 100,000 records or so, and then insert performance plummets. The table uses a fairly large text column as a foreign key, and we are changing this to an auto-number key instead, which we know will help. The aim is to have good performance up to 6 GB worth of data (approximately 60-120 million rows over a small number of tables). A sketch of the kind of schema change we have in mind follows below.
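
For illustration, the change from a text foreign key to a generated integer key might look like the following Derby DDL issued over JDBC. The database name, table names, and column sizes are all hypothetical, not our actual schema:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class SurrogateKeySketch {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:derby:demoDB;create=true");
                 Statement st = conn.createStatement()) {
                // Parent table: the long text value is stored once; a
                // compact auto-generated integer becomes the primary key.
                st.executeUpdate("CREATE TABLE category ("
                        + " id INT GENERATED ALWAYS AS IDENTITY PRIMARY KEY,"
                        + " name VARCHAR(200) NOT NULL UNIQUE)");
                // Child table references the narrow integer key instead of
                // repeating the text value, keeping the FK index small.
                st.executeUpdate("CREATE TABLE record ("
                        + " id INT GENERATED ALWAYS AS IDENTITY PRIMARY KEY,"
                        + " category_id INT NOT NULL REFERENCES category (id),"
                        + " payload VARCHAR(1000))");
            }
        }
    }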

Is your page cache larger than the default size? In older versions of Derby there is a problem with large page caches: performance degrades while the page cache is filling. Once the page cache has been filled, performance will increase again.
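
For example, the page cache can be enlarged by setting derby.storage.pageCacheSize (a count of pages) before the engine boots, either in derby.properties or as a system property. A minimal sketch; the value is purely illustrative (at Derby's default 4 KB page size, 10,000 pages is roughly a 40 MB cache):

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class PageCacheSketch {
        public static void main(String[] args) throws Exception {
            // Must be set before the Derby engine boots (the same line
            // can go in derby.properties instead).
            System.setProperty("derby.storage.pageCacheSize", "10000");
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:derby:demoDB;create=true")) {
                System.out.println("Booted with enlarged page cache");
            }
        }
    }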

Some experience from other developers would help us move forward, as would any reasoned test results showing that Derby is a good choice for us.


I have inserted 15 GB of data into a Derby database, and insert performance was pretty stable.
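
For what it is worth, a common bulk-load pattern with embedded Derby is to turn autocommit off and insert in JDBC batches. A minimal sketch, not necessarily how the test above was run; the load_test table, batch size, and row count are illustrative assumptions:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.Statement;

    public class BulkInsertSketch {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:derby:demoDB;create=true")) {
                try (Statement st = conn.createStatement()) {
                    st.executeUpdate("CREATE TABLE load_test ("
                            + " id INT GENERATED ALWAYS AS IDENTITY PRIMARY KEY,"
                            + " payload VARCHAR(100))");
                }
                conn.setAutoCommit(false);   // commit per batch, not per row
                try (PreparedStatement ps = conn.prepareStatement(
                        "INSERT INTO load_test (payload) VALUES (?)")) {
                    for (int i = 0; i < 1_000_000; i++) {
                        ps.setString(1, "row " + i);
                        ps.addBatch();
                        if (i % 10_000 == 9_999) { // flush every 10,000 rows
                            ps.executeBatch();
                            conn.commit();
                        }
                    }
                    ps.executeBatch();         // flush the final partial batch
                    conn.commit();
                }
            }
        }
    }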

--
Øystein Grøvlen, Senior Staff Engineer
Sun Microsystems, Database Technology Group
Trondheim, Norway
