Thomas,
Thank you, I have not tried the batch feature yet. But you have
given me some great stuff to work with.
I will have to try this and compare the timings to the single
INSERT string approach below. I may also have to adjust the
commit number up/down to achieve optimal performance.
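For the timing comparison, the batch and commit-interval ideas combine into something like the sketch below. This is illustrative only: insertBatched, totalRows, and commitEvery are made-up names, and it assumes a table t with three integer columns and an open java.sql.Connection. commitEvery is the "commit number" to tune up or down.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchedInsert {

    // Insert totalRows rows into t, committing every commitEvery rows.
    static void insertBatched(Connection conn, int totalRows, int commitEvery)
            throws SQLException {
        conn.setAutoCommit(false); // commit manually at the chosen interval
        try (PreparedStatement pstmt =
                 conn.prepareStatement("INSERT INTO t VALUES (?, ?, ?)")) {
            for (int i = 1; i <= totalRows; i++) {
                pstmt.setInt(1, i);
                pstmt.setInt(2, i);
                pstmt.setInt(3, i);
                pstmt.addBatch();
                if (i % commitEvery == 0) {
                    pstmt.executeBatch(); // send the pending batch
                    conn.commit();        // end the current transaction
                }
            }
            pstmt.executeBatch(); // flush any remaining rows
            conn.commit();
        }
    }
}
```

Timing runs with different commitEvery values (say 100 vs. 1000) should show where the sweet spot is for your data.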
- Derek
------------------------------------------------------------------------
*From:* Thomas J. Taylor [mailto:[EMAIL PROTECTED]
*Sent:* Wednesday, March 14, 2007 12:01 PM
*To:* 'Derby Discussion'
*Subject:* RE: Large multi-record insert performance
Hi Derek,
Have you tried batch inserts?
PreparedStatement pstmt = connection.prepareStatement(
    "INSERT INTO t VALUES (?, ?, ?)");
for (int i = 1; i <= 1000; i++) {
    pstmt.setInt(1, i);
    pstmt.setInt(2, i);
    pstmt.setInt(3, i);
    pstmt.addBatch();
}
int[] results = pstmt.executeBatch();
Thomas
------------------------------------------------------------------------
*From:* Sedillo, Derek (Mission Systems)
[mailto:[EMAIL PROTECTED]
*Sent:* Wednesday, March 14, 2007 11:06 AM
*To:* [email protected]
*Subject:* Large multi-record insert performance
Hello,
I am trying to find the most efficient way to handle multi-record
inserts into a single table in Derby. Currently we use Oracle to
perform multi-record inserts with 'one' insert statement for large
amounts of data. This bulk loading process greatly enhances
performance. For example:
INSERT INTO SSJ
VALUES (:tmp_ssj_data);  -- where tmp_ssj_data is an array (100s
or 1000s) of C structured records
Is there any way in Java DB to perform something similar? So let's
say table t has three integer columns.
INSERT INTO t VALUES (?,?,?)
Now you have 1000 rows to insert into t that look like this:
(1,1,1), (2,2,2), (3,3,3), (4,4,4), …
How would you pass these 1000 records in as VALUES with a 'single'
insert statement?
I am thinking of two options so far:
1. Use a prepared statement and 'set' all 1000 record values,
then execute. But I question whether calling the set method
for all 1000 records would buy anything in terms of performance.
2. The other option would be to iterate an array of 1000 records
and create a VALUES clause string to look like:
String records = "(1,1,1), (2,2,2), (3,3,3), (4,4,4), …(1000,
1000, 1000)";
Then append the records to the insert statement and execute:
String query = "INSERT INTO t VALUES " + records;
Execute query;
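Option 2 can be sketched like this. It is a minimal sketch: buildValuesClause is a made-up helper name, and the single-string approach assumes the target DBMS accepts a multi-row VALUES list.

```java
public class ValuesClauseBuilder {

    // Build "(1,1,1), (2,2,2), ... (n,n,n)" for the example data above.
    static String buildValuesClause(int n) {
        StringBuilder sb = new StringBuilder();
        for (int i = 1; i <= n; i++) {
            if (i > 1) sb.append(", ");
            sb.append('(').append(i).append(',')
              .append(i).append(',').append(i).append(')');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        String query = "INSERT INTO t VALUES " + buildValuesClause(1000);
        // then: statement.executeUpdate(query); using a plain java.sql.Statement
        System.out.println(query.length() + " characters of SQL");
    }
}
```

One caveat worth timing: each fresh SQL string is a new statement to compile, and very large literal lists can run into statement-length limits, so this should be compared against the prepared-statement batch approach rather than assumed faster.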
Which method would be the most efficient way to perform these 1000
inserts in Derby? Is there a more optimized way to do this?
Thank you for any experience or ideas you may have regarding large
data inserts.
Derek Sedillo
SWAFS DBA / Software Engineer
*Northrop Grumman Mission Systems*
Tel: (719) 570-8256