>>I have been handed a couple of requests to change some data. One is an
>>update on 3.6 million records and the other is two different deletions of
>>over 16 million rows of data. I need to be able to do these in batches as I
>>am told that Oracle will "blow up" if I do them in one shot.
>>
>>I would like to do them in increments of 50,000 then a commit...then the
>>next 50,000 and then another commit.
>>
>>What is the best way to do this?
>>
>>Robert
>>
>>

>Perhaps not strictly DBI, but I would bundle the delete statements into a
>PL/SQL block, which means you don't get a lot of back and forth.  Off the
>top of my head, *without testing* (or even a whole lot of thought), it
>could be something like:
>
>declare
>  batch_size number := 50000;
>  rows_processed number := batch_size;
>begin
>  while rows_processed = batch_size loop
>    -- "my_table" is a placeholder; add your real deletion criteria here
>    delete from my_table where rownum <= batch_size;
>    rows_processed := SQL%ROWCOUNT;
>    commit;
>  end loop;
>end;

This reminds one of Knuth's famous quip, "Premature optimization is
the root of all evil."

This optimization might have made some sense decades ago over a serial
line, but not today.  It saves 320 single-packet round trips, which is an
infinitesimal amount of traffic.
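For what it's worth, the same batch-and-commit loop can be driven entirely
from the client side.  Here is a minimal sketch of the pattern; it uses
Python's sqlite3 purely as a stand-in for DBD::Oracle, the table name
big_table is hypothetical, and the LIMIT subquery plays the role Oracle's
ROWNUM plays in the PL/SQL version:

```python
import sqlite3

# Set up a throwaway table with 120,000 rows to delete.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big_table (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO big_table (id) VALUES (?)",
                 [(i,) for i in range(120_000)])
conn.commit()

BATCH_SIZE = 50_000
total_deleted = 0
while True:
    # Delete one batch; SQLite has no ROWNUM, so a LIMIT subquery
    # selects the batch instead.
    cur = conn.execute(
        "DELETE FROM big_table WHERE id IN "
        "(SELECT id FROM big_table LIMIT ?)", (BATCH_SIZE,))
    conn.commit()  # commit after each batch, as the original poster asked
    total_deleted += cur.rowcount
    if cur.rowcount < BATCH_SIZE:
        break  # a short batch means the table is exhausted

print(total_deleted)  # 120000
```

The loop exits on the first batch that comes back short, exactly like the
`rows_processed = batch_size` test in the PL/SQL block.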

Mark
