Well, actually there is.
Do the processing in a plperlu function that uses its own connection to the
db. Then every invocation of the function has its own transaction.
Try to open that Perl connection outside the function (i.e. only once, not on
every call), or your performance will drop too much.
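
Untested sketch of the idea; table, connection string and credentials are all
made up, and the connection is cached in %_SHARED so it is opened only once
per session:

CREATE OR REPLACE FUNCTION process_record(rec_id integer) RETURNS void AS $$
    use DBI;
    my ($rec_id) = @_;

    # open the second connection only once per backend session;
    # reconnecting on every call is what kills performance
    my $dbh = $_SHARED{dbh} ||= DBI->connect(
        'dbi:Pg:dbname=mydb', 'myuser', 'secret',
        { AutoCommit => 0, RaiseError => 1 });

    # work done through $dbh runs in its own transaction,
    # independent of the transaction the calling query is in
    $dbh->do('UPDATE queue SET done = true WHERE id = ?', undef, $rec_id);
    $dbh->commit;   # commit per record
$$ LANGUAGE plperlu;

Calling such a function for every row you fetch gives you a commit per
record. Keep in mind those commits happen outside the calling transaction,
so they will not be rolled back with it.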
I use this technique to fetch, process and store records from an Oracle db
(ERP) in a PostgreSQL db (data warehousing/reporting/external data).
It gets me some 500,000 records in a little over 50 minutes.
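
Stripped down, that job looks roughly like this (DSNs, table and column names
are invented, and the real function does more processing per row; here the
Oracle side is the extra DBI connection, while the local inserts go through
SPI in the calling transaction):

CREATE OR REPLACE FUNCTION copy_erp_orders() RETURNS integer AS $$
    use DBI;

    # source: the Oracle ERP db via DBD::Oracle
    my $ora = $_SHARED{ora} ||= DBI->connect(
        'dbi:Oracle:erp', 'reader', 'secret', { RaiseError => 1 });
    my $sth = $ora->prepare('SELECT order_id, amount FROM orders');
    $sth->execute;

    # target: the local PostgreSQL db, written to through SPI
    my $plan = spi_prepare(
        'INSERT INTO dw_orders (order_id, amount) VALUES ($1, $2)',
        'integer', 'numeric');

    my $n = 0;
    while (my @row = $sth->fetchrow_array) {
        spi_exec_prepared($plan, @row);
        $n++;
    }
    return $n;
$$ LANGUAGE plperlu;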
Real Perl addicts would probably have a lot to say about my coding, but being
a Perl illiterate (does one write it like that?) I must say that I'm quite
happy with the result.
All I used was the PL/Perl chapter in the PostgreSQL manual (big hurray for
the manual), Perl Express for testing and debugging a lot of very basic
things like hashes (yes, illiterate !-) and EMS SQL Manager.
Good luck

>>> "A. Kretschmer" <[EMAIL PROTECTED]> 2008-04-07 15:01 >>>
On Mon, 07.04.2008 at 14:46:50 +0200, [EMAIL PROTECTED] wrote the following:
> Hi,
> I have to execute a commit for every record that I process during a cursor
> fetch in a function.
> Is there a way to do it?

No.


Regards, Andreas
-- 
Andreas Kretschmer
Contact:  Heynitz: 035242/47150,   D1: 0160/7141639 (more: -> header)
GnuPG-ID:   0x3FFF606C, private 0x7F4584DA   http://wwwkeys.de.pgp.net
