On Tuesday 15 October 2002 01:23, Philip Daggett wrote:
> I'm downloading several million records from an Oracle database to a
> MySQL database and would like to use fetchall_arrayref() to do it.
> However, there are so many records that my computer's memory fills up
> and the process crashes.
>
> Is there a way of "chunking" the data coming down, or do I need to use
> fetchrow_arrayref() and do it one record at a time (several million
> times)?

What's the problem with that? If you're importing data into MySQL, the 
speed of the fetch loop shouldn't matter much; calling 
fetchrow_arrayref() once per row is plenty fast. Inserting the data 
into the target database is much slower than fetching it.
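For what it's worth, DBI does support "chunking" directly: 
fetchall_arrayref() takes an optional second argument, $max_rows, which 
limits how many rows are returned per call, so you can fetch in bounded 
batches. A minimal sketch of both approaches follows -- the DSNs, 
credentials, and table/column names are placeholders, not anything from 
your setup:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Placeholder connections -- adjust DSNs, credentials, and table names.
my $ora = DBI->connect('dbi:Oracle:ORCL',   'user', 'pass', { RaiseError => 1 });
my $my  = DBI->connect('dbi:mysql:target',  'user', 'pass', { RaiseError => 1 });

my $sel = $ora->prepare('SELECT id, name FROM big_table');
my $ins = $my->prepare('INSERT INTO big_table (id, name) VALUES (?, ?)');

$sel->execute;

# Option 1: one row at a time -- constant memory use.
while (my $row = $sel->fetchrow_arrayref) {
    $ins->execute(@$row);
}

# Option 2: batches of 10_000 rows via fetchall_arrayref's $max_rows
# argument -- bounded memory, fewer method calls per row.
# while (my $rows = $sel->fetchall_arrayref(undef, 10_000)) {
#     last unless @$rows;    # no more rows to fetch
#     $ins->execute(@$_) for @$rows;
# }

$sel->finish;
$ora->disconnect;
$my->disconnect;
```

Either way, memory stays flat instead of holding all several million 
rows at once, which is what fetchall_arrayref() with no $max_rows does.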
