On Thu, 19 Nov 2009 01:07:25 -0700, Dan Fish wrote:
> I'm working on a project that requires some rather large extractions
> from an Oracle DB (about 2 million rows) and while monitoring the task
> manager, I get an "Out of memory error" at about 2GB of mem usage.
[...]
> Is there a built-in way to transparently "disk buffer" the returned
> records? (I.E. not having to handle the buffering details myself and
> still be able to use $sth->fetchall_arrayref() )

Well there's yer problem... you want to slurp the whole database into 
memory, you're asking for trouble.

I've done a lot of DBD::Oracle programming over more than 10 years and never 
met a situation where I needed to do that.  I can safely say that your 
program will be better off using some sort of while ( ...->fetch ) loop.
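A minimal sketch of that pattern (the connection string, credentials, and 
table name here are placeholders, not anything from your setup):

```perl
use strict;
use warnings;
use DBI;

# Placeholder connection details -- substitute your own.
my $dbh = DBI->connect( 'dbi:Oracle:MYDB', 'user', 'pass',
    { RaiseError => 1, AutoCommit => 0 } );

# RowCacheSize hints how many rows the driver pre-fetches per round
# trip, so you can tune throughput without slurping the whole result.
$dbh->{RowCacheSize} = 10_000;

my $sth = $dbh->prepare('SELECT id, name FROM big_table');
$sth->execute;

# Only one row lives in Perl memory at a time, instead of 2 million.
while ( my $row = $sth->fetchrow_arrayref ) {
    my ( $id, $name ) = @$row;
    # ... process one record here ...
}

$dbh->disconnect;
```

Memory stays flat no matter how many rows the query returns, which is the 
whole point.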

If you think that would change the structure of your program unacceptably, 
you need to learn about closures and iterators.  Get Mark-Jason Dominus' 
book "Higher Order Perl" if you want to learn all about those.
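For instance, an iterator closure hides the fetch loop from the rest of the 
code.  This sketch simulates rows with a plain list, but a 
$sth->fetchrow_arrayref call would slot into the same place:

```perl
use strict;
use warnings;

# make_iterator returns a closure over its own private cursor; each
# call hands back the next item, then undef when exhausted.
sub make_iterator {
    my @rows = @_;
    my $i = 0;
    return sub {
        return if $i >= @rows;
        return $rows[ $i++ ];
    };
}

my $next_row = make_iterator( [ 1, 'a' ], [ 2, 'b' ] );
while ( my $row = $next_row->() ) {
    print "@$row\n";
}
```

Callers just invoke $next_row->() and never see the buffering details, so 
the surrounding program structure barely changes.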

-- 
Peter Scott
http://www.perlmedic.com/
http://www.perldebugged.com/
http://www.informit.com/store/product.aspx?isbn=0137001274
