I am having a problem with a SELECT query against a large dataset
(several million rows).  Because DBD::Pg fetches the entire result set
into client memory at execute() time, before letting me process any of
it, I'm using up a very large amount of RAM.  Enough that I'm concerned
about running out.
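For concreteness, the pattern I'm using is essentially the following
sketch (table and column names are placeholders).  Even though rows are
fetched one at a time, DBD::Pg has already pulled the whole result set
over to the client by the time execute() returns:

    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:Pg:dbname=mydb', '', '',
                           { RaiseError => 1 });

    my $sth = $dbh->prepare('SELECT id, payload FROM big_table');
    $sth->execute;    # the entire result set is buffered client-side here

    while (my @row = $sth->fetchrow_array) {
        # ... process one row at a time, but the RAM is already spent ...
    }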

Is there any way to tell it that I wish to process data incrementally
and do not want all of the data in RAM at once?

For this situation I'm looking at changing my SELECT to populate a temp
table instead, then using a COPY command to pull the data into Perl,
split it, and process it (a sketch of this follows).  Painfully ugly,
but doable.  However, if I'm going to face this situation again in the
future (and I'm likely to), I'd really like a better solution.
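Here is a minimal sketch of that workaround, again with placeholder
table and column names, using DBD::Pg's pg_getcopydata to read the
COPY stream one row at a time:

    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:Pg:dbname=mydb', '', '',
                           { RaiseError => 1, AutoCommit => 1 });

    # Materialize the SELECT into a temp table so COPY can stream it out.
    $dbh->do(q{
        CREATE TEMP TABLE big_result AS
        SELECT id, payload FROM big_table
    });

    $dbh->do('COPY big_result TO STDOUT');

    # pg_getcopydata returns -1 once the COPY stream is exhausted.
    my $line;
    while ($dbh->pg_getcopydata($line) >= 0) {
        chomp $line;
        my @fields = split /\t/, $line, -1;  # COPY text format is tab-delimited
        # ... process one row's @fields here ...
    }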

Thanks,
Ben
