Thanks!
These are great suggestions for someone limited to Java!

Christian
 

-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf
Of Larry Meadors
Sent: Monday, December 17, 2007 9:30 PM
To: [email protected]
Subject: Re: Best way to insert thousands of records

Yeah, a native tool will kick some serious booty on this task.

If you can't use a native tool, here are some other tricks you can do to
really get it cracking.

Use a batch, but commit every 100 or 1000 or 10000 records - play with
this value; it'll be different in different environments. If you don't
commit periodically, the rollback data gets big and can bog down the
database. If this is an all-or-nothing transaction, you can still do
this... but do it into a temporary table, then use a stored procedure
(or a single statement) to move the records from the temp table to the
real one.
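
The "commit every N records" idea can be sketched as below. This is only an illustration of the chunking logic: insert() and commit() are hypothetical stand-ins for the real iBATIS calls (sqlMap.insert("MyObject.insert", row) and committing/restarting the transaction), so the counting can be shown on its own.

```java
import java.util.Collections;
import java.util.List;

public class ChunkedCommitSketch {
    static int commits = 0;

    static void insert(Object row) {
        // stand-in for: sqlMap.insert("MyObject.insert", row);
    }

    static void commit() {
        // stand-in for: commit the current transaction and start a new one
        commits++;
    }

    static void insertAll(List<?> rows, int batchSize) {
        int pending = 0;
        for (Object row : rows) {
            insert(row);
            if (++pending == batchSize) {
                commit();   // keeps the rollback/undo data small
                pending = 0;
            }
        }
        if (pending > 0) {
            commit();       // flush the final partial chunk
        }
    }

    public static void main(String[] args) {
        insertAll(Collections.nCopies(2500, new Object()), 1000);
        System.out.println(commits); // 1000 + 1000 + 500 -> 3 commits
    }
}
```

Tune batchSize per environment, as noted above - there's no universally right value.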

Disable constraints or drop indexes (or both) on the table you are
inserting into, then validate the input and bring them back at the end.
This is what some bulk loaders do, too. If you can't do this, the same
temp table approach above can work for this, too.

The idea with the temp table is to get the data into the database (even
if it's in the wrong place) and then do the juggling there - odds are,
if the data set is huge (300k rows), it'll be faster to work on it in
the database than to manipulate it in Java.
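
The move out of the temp table could itself be a single sqlmap statement, along these lines (a sketch only - the staging table name and the validation condition are illustrative, reusing the MyObject columns from the example quoted below):

```xml
<insert id="moveFromStaging">
  INSERT INTO MyObject (prop1, prop2)
  SELECT prop1, prop2
  FROM MyObject_staging
  WHERE prop1 IS NOT NULL
</insert>
```

One set-based statement like this lets the database do the validation and copying in bulk, instead of a second round of row-by-row traffic from Java.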

Larry


On Dec 17, 2007 3:37 PM, Poitras Christian
<[EMAIL PROTECTED]> wrote:
> Thanks a lot both of you!
> That's a possibility I hadn't checked. I would need to create a temp 
> file (or some sort of input), but performance is likely to be much 
> better.
>
> Christian
>
>
> -----Original Message-----
> From: Christopher Lamey [mailto:[EMAIL PROTECTED]
> Sent: Monday, December 17, 2007 5:31 PM
> To: [email protected]
> Subject: Re: Best way to insert thousands of records
>
>
> Yeah, that's immediately what I thought - don't go through Java.  Each 
> RDBMS has its own bulk import tool: Oracle has sqlldr, MS SQL Server 
> has bcp, and Postgres has the COPY statement.  They are much, much 
> faster than going through JDBC at all.
>
> On 12/17/07 3:14 PM, "Larry Meadors" <[EMAIL PROTECTED]> wrote:
>
> > Do we have to use iBATIS...or Java for that matter?
> >
> > Larry
> >
> >
> > On Dec 17, 2007 3:00 PM, Poitras Christian 
> > <[EMAIL PROTECTED]>
> > wrote:
> >>
> >>
> >> Hi!
> >>
> >> I would like to have your opinion on the fastest way to insert 300k
> >> rows into a database.
> >>
> >> Here are two ways I thought about.
> >> Method 1:
> >> startBatch();
> >> for (int i = 0; i < myObjects.size(); i++) {
> >>   insert("MyObject.insert", myObjects.get(i));
> >> }
> >> executeBatch();
> >>
> >>
> >> SqlMap
> >> <insert id="insert">
> >>   INSERT INTO MyObject (prop1, prop2) VALUES (#prop1.id#, #prop2.id#)
> >> </insert>
> >>
> >>
> >>
> >> Method 2:
> >> startBatch();
> >> for (int i = 0; i < myObjects.size(); i += 30000) {
> >>   insert("MyObject.insert",
> >>          myObjects.subList(i, Math.min(i + 30000, myObjects.size())));
> >> }
> >> executeBatch();
> >>
> >>
> >>
> >> SqlMap
> >> <insert id="insert">
> >>   INSERT INTO MyObject (prop1, prop2) VALUES (#prop1.id#, #prop2.id#)
> >> </insert>
> >>
> >>
> >> Maybe I'm way off... so I would like to hear about your experiences.
> >>
> >> Thanks
> >> Christian
>
>
