I'm working on a project that requires some rather large extractions from an
Oracle DB (about 2 million rows), and while monitoring Task Manager I see the
process die with an "Out of memory" error at about 2 GB of memory usage.

 

Client System Specifics:

 

Win 2K3 Server 32-Bit SP2

Intel Xeon 2.5 GHz, 8 CPUs, 4 GB RAM

ActiveState Perl 5.10.1

DBD-Oracle v1.21

DBI v1.607

Oracle 8i Database running on a SPARC/Solaris platform (using Oracle Client
10.2 on the windows box)

 

I've never tried handling any extractions this big before, so have several
questions:

 

Is there a 2 GB address-space barrier in Perl?

 

Being a Windows "Server" version (which I'm not familiar with), are there
Windows user/process settings that might be limiting the maximum usable RAM?

 

Is there a built-in way to transparently "disk buffer" the returned records?
(i.e., not having to handle the buffering details myself, while still being
able to use $sth->fetchall_arrayref())
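For reference, here's roughly what I'm doing now (a simplified sketch -- the
connection string, table, and query are placeholders, not my real ones):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Placeholder connection details -- my real DSN/credentials differ.
my $dbh = DBI->connect( 'dbi:Oracle:mydb', 'user', 'pass',
                        { RaiseError => 1, AutoCommit => 0 } );

my $sth = $dbh->prepare('SELECT * FROM some_big_table');
$sth->execute;

# fetchall_arrayref() with no arguments pulls every row into memory
# at once -- with ~2 million rows this is where the process tops out
# near 2 GB.
my $rows = $sth->fetchall_arrayref();
```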

 

Any other elegant methods or suggestions for handling data extractions of
this size?

 

Once again, this is new territory for me... any suggestions or examples
greatly appreciated!

 

Thanks!

-Dan

 

---

"There are only 10 kinds of people in the world... Those that understand
binary and those that don't!"

 
