Using a prepared statement plus batching will be your fastest approach. Even
though you pass 1000 records per query, Derby still has to parse that query
each time, whereas a prepared statement is parsed only once. And if you use
batching, all your data is passed at once, so Derby can make better use of
your disk drive (instead of flushing the index after each insert, it does it
once for the whole batch; this was the case in MySQL, I may be wrong about
Derby).
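As a sketch of the prepared-statement-plus-batch pattern being described (the table name "tags" and column are my own invention for illustration, and this assumes derby.jar is on the classpath for the in-memory database used in main):

```java
import java.sql.*;

public class BatchInsertDemo {

    // Prepare the INSERT once, queue every row with addBatch(), send them all
    // with a single executeBatch(), and commit once for the whole batch.
    public static int insertTags(Connection conn, String[] tags) throws SQLException {
        conn.setAutoCommit(false);
        try (PreparedStatement ps =
                 conn.prepareStatement("INSERT INTO tags(name) VALUES (?)")) {
            for (String t : tags) {
                ps.setString(1, t);
                ps.addBatch();            // queued, not yet executed
            }
            int[] counts = ps.executeBatch();  // one round trip for all rows
            conn.commit();                     // one transaction for the lot
            int total = 0;
            for (int c : counts) total += c;
            return total;
        } catch (SQLException e) {
            conn.rollback();
            throw e;
        }
    }

    public static void main(String[] args) throws SQLException {
        // In-memory embedded Derby database, created on first connect
        try (Connection conn =
                 DriverManager.getConnection("jdbc:derby:memory:demo;create=true")) {
            try (Statement st = conn.createStatement()) {
                st.executeUpdate("CREATE TABLE tags(name VARCHAR(64))");
            }
            int n = insertTags(conn, new String[] {"a", "b", "c"});
            System.out.println(n + " tags inserted");
        }
    }
}
```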
Along these lines I did some testing for our own application.
Cached PreparedStatement but one transaction per insert:
100 tags added in 399ms
100 tags removed in 160ms
1000 tags added in 1163ms
1000 tags removed in 873ms
10000 tags added in 6094ms
10000 tags removed in 6840ms
100000 tags added in 58563ms
100000 tags removed in 67342ms
All in one transaction using executeUpdate():
100 tags added in 274ms
100 tags removed in 70ms
1000 tags added in 299ms
1000 tags removed in 250ms
10000 tags added in 1605ms
10000 tags removed in 1500ms
100000 tags added in 14441ms
100000 tags removed in 19721ms
All in one transaction using addBatch()/executeBatch():
100 tags added in 290ms
100 tags removed in 76ms
1000 tags added in 316ms
1000 tags removed in 258ms
10000 tags added in 1621ms
10000 tags removed in 1927ms
100000 tags added in 14971ms
100000 tags removed in 19320ms
So it certainly seems that batching in itself has no real benefit, but
reducing the number of transactions has dramatic benefits.
Of course, the numbers themselves are meaningless without redoing the same
test on your own application.
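The change that made the difference in the middle and bottom runs above is simply turning off autocommit and committing once at the end, rather than letting the driver commit after every insert. A minimal sketch, again using a hypothetical "tags" table and assuming derby.jar is on the classpath:

```java
import java.sql.*;

public class OneTxDemo {

    // All inserts in ONE transaction: autocommit off, a single commit at the end.
    // Each row is still its own executeUpdate(), matching the middle benchmark;
    // the win comes from Derby syncing the log once instead of once per row.
    public static void insertInOneTx(Connection conn, String[] tags) throws SQLException {
        conn.setAutoCommit(false);
        try (PreparedStatement ps =
                 conn.prepareStatement("INSERT INTO tags(name) VALUES (?)")) {
            for (String t : tags) {
                ps.setString(1, t);
                ps.executeUpdate();   // one statement per row, no per-row commit
            }
        }
        conn.commit();                // the only commit for the whole run
    }

    public static void main(String[] args) throws SQLException {
        try (Connection conn =
                 DriverManager.getConnection("jdbc:derby:memory:txdemo;create=true")) {
            try (Statement st = conn.createStatement()) {
                st.executeUpdate("CREATE TABLE tags(name VARCHAR(64))");
            }
            insertInOneTx(conn, new String[] {"a", "b", "c"});
        }
    }
}
```

The per-insert-transaction variant in the first benchmark is the same loop with autocommit left on (the JDBC default), which commits after every executeUpdate().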
Daniel Noll
Nuix Pty Ltd
Suite 79, 89 Jones St, Ultimo NSW 2007, Australia Ph: +61 2 9280 0699
Web: http://nuix.com/ Fax: +61 2 9212 6902