-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

On Saturday 26 October 2002 00:29, Mike Nielsen wrote:
> The "execute" seems to be returning all the rows into the address
> space of my program.  Since the result set is ~4 million rows, this
> has diabolical consequences on memory and, in fact, the program gets
> whacked by the kernel once it has exhausted memory (and caused my
> machine to thrash wildly for quite some time...).  I've run this
> under the perl debugger, and it dies before it gets to the
> fetchrow_arrayref contained in the ParseQS module -- it never returns
> from the "execute".
>
> So, where's the bug?  If it is in my code, I'd be grateful for tips
> on the correct way to fetch big data.

It is not a bug, but a feature. If the rows were returned one by one, 
the DB connection would be extremely slow, especially over TCP/IP 
connections.

You should do most of the work on the DB side, not in Perl. If you need 
to select certain rows using a complex algorithm that is not readily 
expressed in SQL, write a stored procedure in PostgreSQL; with PL/Perl 
it is possible to write stored procedures in Perl.
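As a rough sketch of what that looks like (assuming the plperl language 
is installed in your database; the table and the filtering rule here are 
made up for illustration):

```sql
-- Hypothetical example: push a row filter that is awkward in plain
-- SQL (a Perl regex test) into the backend, so only matching rows
-- ever cross the connection.
CREATE FUNCTION is_interesting(text) RETURNS bool AS '
    my ($val) = @_;
    return ($val =~ /^[aeiou].*\\d{3,}$/i) ? ''t'' : ''f'';
' LANGUAGE 'plperl';

SELECT * FROM big_table WHERE is_interesting(payload);
```

That way the 4 million candidate rows stay in the backend and only the 
survivors are shipped to your Perl program.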

I believe RowCacheSize has no effect in DBD::Pg. The docs underline 
the fact that the entire result set is copied into the frontend 
program's memory space.

If you need to get all those millions of rows from the table, consider 
the SQL COPY command (if it suits the purpose), or use LIMIT ... OFFSET 
to fetch the data in chunks. You'll have to use ORDER BY with LIMIT, or 
the chunks won't be consistent between queries.
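A minimal sketch of the LIMIT/OFFSET approach in Perl DBI (the DSN, 
table and column names are invented for illustration; each execute() 
then copies only one chunk into your program's memory, not the whole 
table):

```perl
#!/usr/bin/perl -w
use strict;
use DBI;

# Hypothetical connection; substitute your own DSN and credentials.
my $dbh = DBI->connect('dbi:Pg:dbname=mydb', '', '',
                       { RaiseError => 1 });

my $chunk  = 10_000;   # rows per round trip
my $offset = 0;

# ORDER BY is essential: without it, LIMIT/OFFSET gives no
# guarantee that successive chunks don't overlap or skip rows.
my $sth = $dbh->prepare(
    'SELECT id, payload FROM big_table ORDER BY id LIMIT ? OFFSET ?');

while (1) {
    $sth->execute($chunk, $offset);
    my $got = 0;
    while (my $row = $sth->fetchrow_arrayref) {
        $got++;
        # ... process $row->[0], $row->[1] here ...
    }
    last if $got < $chunk;   # short chunk means we reached the end
    $offset += $chunk;
}

$dbh->disconnect;
```

Note that OFFSET gets slower as it grows, since the backend still has 
to walk past the skipped rows; for a one-off bulk dump, COPY is much 
faster.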

-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.0.6 (GNU/Linux)
Comment: For info see http://www.gnupg.org

iD8DBQE9uki6nksV4Ys/z5gRAjJOAJ4qyEqsHFHFvjNe3M1sAerZamFVFwCcCMAS
zDU/YXKlQwZrDpJhnYYesL4=
=bUcs
-----END PGP SIGNATURE-----
