Hello, 

I've got a table in an Oracle database with approx. 100,000 records that
I'd like to copy into a table in a PostgreSQL database. (This needs to
be done a couple of times per week.)

I have written a short Perl script on a server that has remote access
to both the Oracle database and the PostgreSQL database. I am running
PostgreSQL 7.4.1 on FreeBSD.

My Perl script looks something like this:

[...]
my $sth2 = $cnx2->prepare('SELECT * FROM oracle_table');
$sth2->execute();

while (my ($field2, $field5, $field6) = $sth2->fetchrow_array) {
        if (defined($field2)) {
                my $sth = $cnx->prepare(
                        'INSERT INTO the_pg_table (field1, field2) VALUES (?, ?)');
                $sth->execute($field2, $field5);
                $sth->finish;
        }
}
[...]

It runs fine - and I get no errors - but it takes almost 25 minutes to
complete. I tried running the script while just grabbing the rows from
the Oracle database and writing them to a text file, and then it only
takes a couple of minutes. So it must be the INSERT commands that choke -
is there a better way to do it?
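(For what it's worth, I was wondering whether preparing the INSERT only
once and committing everything in a single transaction would help -
something along these lines, using the same $cnx/$sth2 handles as above,
though I haven't actually tried it yet:)

```perl
# Sketch only - untested. Assumes $cnx (PostgreSQL) and $sth2 (Oracle
# SELECT handle) are set up as in the script above.
$cnx->{AutoCommit} = 0;    # batch all inserts into one transaction

my $sth = $cnx->prepare(
        'INSERT INTO the_pg_table (field1, field2) VALUES (?, ?)');

while (my ($field2, $field5, $field6) = $sth2->fetchrow_array) {
        $sth->execute($field2, $field5) if defined $field2;
}

$cnx->commit;              # one commit instead of one per row
$sth->finish;
```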

Any advice much appreciated.

/mich


-- 
Best Regards,
        Michael L. Hostbaek 

        */ PGP-key available upon request /*
