Is your machine CPU-bound during the insert? If not, putting the
transaction log on a separate disk may help.
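
For illustration, Derby lets you place the transaction log on a different device via the logDevice connection attribute. Note it only takes effect when the database is first created; the database name and paths below are hypothetical:

```
jdbc:derby:logdb;create=true;logDevice=/fastdisk/derbylog
```

With the log on its own spindle, log writes no longer compete with data-page I/O during the insert.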

Does your app always load a large number of entries into an empty table,
or does it also load them into a table that already has entries? If you
are loading into an empty table, then using one of the import system
routines (which are built on a VTI, a virtual table interface) would
avoid logging.
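As a sketch of that suggestion: Derby ships the system procedure SYSCS_UTIL.SYSCS_IMPORT_TABLE, which loads a table from a delimited file. The database name, schema, and file name below are assumptions for illustration:

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;

public class BulkImport {
    // Procedure arguments: schema, table, file name, column delimiter,
    // character delimiter, codeset, replace flag.
    static final String IMPORT_CALL =
        "CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE(?, ?, ?, ?, ?, ?, ?)";

    public static void main(String[] args) throws Exception {
        // Embedded connection; "logdb" is a hypothetical database name.
        Connection conn = DriverManager.getConnection("jdbc:derby:logdb");
        CallableStatement cs = conn.prepareCall(IMPORT_CALL);
        cs.setString(1, "APP");      // schema
        cs.setString(2, "LOG");      // target table
        cs.setString(3, "log.csv");  // import file (hypothetical path)
        cs.setString(4, ",");        // column delimiter
        cs.setString(5, null);       // character delimiter (use default)
        cs.setString(6, null);       // codeset (use default)
        cs.setInt(7, 1);             // 1 = REPLACE: empty the table first
        cs.execute();
        cs.close();
        conn.close();
    }
}
```

With REPLACE mode into an empty table, the import can skip per-row logging, which is where the speedup over individual INSERTs comes from.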

SOA Work wrote:
Hi there,

I'm currently using Derby to log values produced by my application.
Each entry consists of a string, a date, and a double value. Not much
data per entry, but I want to insert a huge number of entries.

At the moment, inserting 10,000 entries takes about 2 seconds. I would
like to hear your opinions on how to improve performance (if possible).

Here are the ideas I'm currently using (comment if you like):

1.) I use a table without indexes, primary keys and so on. "create table log(id varchar(50), value double, timestamp date)"

2.) I use the embedded mode

3.) I use a prepared statement "insert into log values (?, ?, ?)"

4.) I disabled auto-commit and use addBatch() for all 10,000 entries. At the
end I call executeBatch() and commit().
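The four points above can be sketched as a single JDBC routine. The database name and the generated sample values are assumptions, not from the original post:

```java
import java.sql.Connection;
import java.sql.Date;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchLogger {
    static final int BATCH_SIZE = 10_000;  // batch size from the post
    // Point 3: one prepared statement, reused for every row.
    static final String INSERT_SQL = "INSERT INTO log VALUES (?, ?, ?)";

    public static void main(String[] args) throws Exception {
        // Embedded mode (point 2); "logdb" is a hypothetical name.
        Connection conn =
            DriverManager.getConnection("jdbc:derby:logdb;create=true");
        conn.setAutoCommit(false);  // point 4: one transaction, not 10,000
        PreparedStatement ps = conn.prepareStatement(INSERT_SQL);
        for (int i = 0; i < BATCH_SIZE; i++) {
            ps.setString(1, "entry-" + i);                        // id
            ps.setDouble(2, Math.random());                       // value
            ps.setDate(3, new Date(System.currentTimeMillis())); // timestamp
            ps.addBatch();
        }
        ps.executeBatch();  // send the whole batch through the driver
        conn.commit();      // one log flush instead of one per row
        ps.close();
        conn.close();
    }
}
```

The main win here is the single commit: Derby forces the log to disk at each commit, so committing once per batch replaces 10,000 synchronous log flushes with one.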

I would like to hear your suggestions or comments.
Best regards
