On 12/31/06, Nikola Milutinovic <[EMAIL PROTECTED]> wrote:
> It was really some time ago since I experimented with this. My last
> experiment was on PG 7.2 or 7.3. I was inserting circa 800,000 rows.
> Inserting without transactions took 25 hrs. Inserting with 10,000 rows
> per transaction took about 2.5 hrs. So, the speedup was 10x. I have not [...]
>
> 1. There is no difference (speed-wise) between committing every 1K or
> every 250K rows.
Frank Finner wrote:
In Java, assuming you have a Connection c, you simply say
"c.commit();" after doing some action on the database. After every
commit, the transaction will be executed and closed and a new one
opened, which runs until the next commit.
Assuming, of course, you started with c.setAutoCommit(false).
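A minimal sketch of that commit-every-N pattern (the 10,000-row interval is Nikola's number from the thread; `commitsNeeded` is a hypothetical helper, and the actual JDBC calls appear only as comments so the batching arithmetic can run without a live database):

```java
public class BatchCommitSketch {
    // Commit after every BATCH_SIZE inserts, plus a final commit for leftovers.
    static final int BATCH_SIZE = 10_000;

    // How many commits a loop over totalRows would issue (ceiling division).
    static int commitsNeeded(int totalRows, int batchSize) {
        return (totalRows + batchSize - 1) / batchSize;
    }

    public static void main(String[] args) {
        // With a real java.sql.Connection c you would first call:
        //   c.setAutoCommit(false);
        // then inside the insert loop:
        //   stmt.executeUpdate(...);
        //   if (n % BATCH_SIZE == 0) c.commit();
        // and finally c.commit(); once more for any remainder.
        System.out.println(commitsNeeded(800_000, BATCH_SIZE));   // Nikola's case: 80 transactions
        System.out.println(commitsNeeded(1_000_001, BATCH_SIZE)); // 101: one extra for the last row
    }
}
```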
> The fastest way will be copy.
> The second fastest will be multi-value inserts in batches, e.g.:
>
> INSERT INTO data_archive VALUES (...), (...), (...) (I don't know what the max is)
>
> but commit every 1000 inserts or so.

Is this some empirical value? Can someone give heuristics as to how to
calculate it?
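The multi-row VALUES form itself is real 8.2 syntax; a sketch of building such a statement with placeholders (table and column names come from the thread, and the rows-per-statement count, like the commit interval, is exactly the empirical knob being asked about, 2 here only for the demo):

```java
public class MultiValuesInsert {
    // Builds a parameterized multi-row INSERT such as:
    //   INSERT INTO data_archive (batchid, raw_data) VALUES (?, ?), (?, ?)
    static String buildInsert(String table, String[] cols, int rows) {
        String placeholders = "(" + "?, ".repeat(cols.length - 1) + "?)";
        StringBuilder sb = new StringBuilder("INSERT INTO ").append(table)
                .append(" (").append(String.join(", ", cols)).append(") VALUES ");
        for (int i = 0; i < rows; i++) {
            if (i > 0) sb.append(", ");
            sb.append(placeholders);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(buildInsert("data_archive",
                new String[]{"batchid", "raw_data"}, 2));
    }
}
```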
Frank Finner wrote:
> In Java, assuming you have a Connection c, you simply say "c.commit();"
> after doing some action on the database.

That did it, thanks!
On Fri, 2006-12-29 at 13:21 -0500, James Neff wrote:
> Joshua D. Drake wrote:
> > Also as you are running 8.2 you can use multi valued inserts...
> >
> > INSERT INTO data_archive VALUES (...), (...), (...)
>
> Would this speed things up? Or is that just another way to do it?

The fastest way will be copy. The second fastest will be multi-value
inserts in batches, but commit every 1000 inserts or so.
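A sketch of the COPY route, assuming the rows have been staged in a CSV file first (the path is illustrative; the column list is the one quoted elsewhere in the thread):

```sql
-- COPY reads the file server-side in a single transaction,
-- avoiding per-row statement and commit overhead entirely.
COPY data_archive (batchid, claimid, memberid, raw_data, status, line_number)
FROM '/tmp/archive.csv' WITH CSV;
```

Later versions of the PostgreSQL JDBC driver also expose this path programmatically via `org.postgresql.copy.CopyManager`, so the data can be streamed from the client without landing in a server-side file first.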
Joshua D. Drake wrote:
> You need to vacuum during the inserts :)
>
> Joshua D. Drake

I ran the vacuum during the INSERT and it seemed to help a little, but
it's still relatively slow compared to the first 2 million records.
Any other ideas?

Thanks,
James
When do you commit these inserts? I occasionally found similar problems
when doing heavy inserting/updating within one single transaction. First
everything runs fast, then after some time everything slows down. If I
commit the inserts every 1000 rows or so (large rows, small engine), this
phenomenon does not occur.
Joshua D. Drake wrote:
> Also as you are running 8.2 you can use multi valued inserts...
>
> INSERT INTO data_archive VALUES (...), (...), (...)

Would this speed things up? Or is that just another way to do it?

Thanks,
James
> > There is also an index on batchid.
> >
> > The insert command is like so:
> >
> > "INSERT INTO data_archive (batchid, claimid, memberid, raw_data, status,
> > line_number) VALUES ('" + commandBatchID + "', '', '', '" + raw_data +
> > "', '1', '" + myFilter.claimLine + "');";

Also as you are running 8.2 you can use multi valued inserts...
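Since raw_data is spliced directly into the SQL string above, any embedded quote character breaks the statement (and invites SQL injection). A sketch of the parameterized alternative: the table and column names come from the quoted code, while the `PreparedStatement` calls are standard JDBC shown only as comments, since there is no live connection here:

```java
public class ArchiveInsertSketch {
    // Parameterized form of the INSERT quoted above: placeholders instead
    // of string concatenation, so a quote in raw_data cannot break the SQL.
    static final String SQL =
        "INSERT INTO data_archive (batchid, claimid, memberid, raw_data, "
      + "status, line_number) VALUES (?, ?, ?, ?, ?, ?)";

    public static void main(String[] args) {
        // With a live java.sql.Connection c (assumed, not shown):
        //   PreparedStatement ps = c.prepareStatement(SQL);
        //   ps.setString(1, commandBatchID);
        //   ps.setString(4, raw_data);
        //   ... remaining parameters ...
        //   ps.addBatch();      // queue the row client-side
        //   ps.executeBatch();  // flush every N rows
        //   c.commit();
        System.out.println(SQL);
    }
}
```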
On Fri, 2006-12-29 at 12:39 -0500, James Neff wrote:
Greetings,

I've got a Java application that reads data from a flat file and inserts
it into a table. The first 2 million rows (each file contained about
1 million lines) went pretty fast: less than 40 minutes to insert into
the database.

After that, the insert speed is slow. I think I may [...]