How about using:

$dbh->{RowCacheSize} = $n;   # like Oracle's prefetch; check the DBI
                             # docs for what values are supported

$sth = $dbh->prepare($sql);
$sth->execute;
$sth->bind_columns(\($var1, $var2, ...));

while ($sth->fetch) {
   # insert the bound row into MySQL here
}

**Note : make sure your DBD::Oracle is current (1.12); otherwise you will
         have to put a '$sth->{NAME};' before the execute as a workaround
         for a caching bug.  The bug was fixed sometime before 1.12, but
         I forget exactly which version, so it is safest to run the
         current one.
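For the "chunking" Philip asked about, fetchall_arrayref() itself takes an
optional second argument, $max_rows, which limits how many rows each call
returns, so you can pull the data down in batches instead of all at once.
A rough sketch (the handles, the 10_000 chunk size, and the MySQL insert
statement are placeholders, not tested code):

    # $ora is the Oracle handle, $my is the MySQL handle (assumed)
    my $sth = $ora->prepare($sql);
    $sth->execute;

    # hypothetical target table/columns on the MySQL side
    my $ins = $my->prepare('INSERT INTO target (col1, col2) VALUES (?, ?)');

    # fetch 10_000 rows per call; only that many rows are ever in memory
    while (my $rows = $sth->fetchall_arrayref(undef, 10_000)) {
        last unless @$rows;               # no rows left
        $ins->execute(@$_) for @$rows;    # copy the batch into MySQL
    }

That keeps memory flat no matter how many million rows come down, though
the bind_columns/fetch loop above is still the cheapest per row.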


On 14-Oct-2002 Philip Daggett wrote:
> I'm downloading several million records from an Oracle database to a MySQL 
> database and would like to use fetchall_arrayref() to do it. However, there 
> are so many records that my computer's memory fills up and then it crashes.
> 
> Is there a way of "chunking" the data coming down, or do I need to use 
> fetchrow_arrayref() and do it one record at a time (several million times)?
> 
> Thanks,
> 
> Phil
> 

----------------------------------
E-Mail: Scott T. Hildreth <[EMAIL PROTECTED]>
Date: 14-Oct-2002
Time: 20:27:57
----------------------------------
