Hi

( 00.12.13 19:15 -0500 ) Matthew J. Brooks:
> I wrote a program (before I knew much.. not that I know more now mind you
> *smirk*) that reads in 10 data files that contain student computer use
> information and creates a report for each student. The original program only
> took seconds to execute at first, but as the file sizes have grown to >25MB
> it is bringing the system it runs on to its knees because it "slurps" all of
> the data to memory.
> 
> I'm looking for ideas to solve this issue...


What about the data files? What format do they need to be in? Who
updates them? How much can you change them?

If you can put them into a DBM file [or files], you can tie a hash to it
(e.g. with DB_File) and use that.
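A sketch of the tied-hash idea. In Perl this would be tie + DB_File; here it's shown with Python's stdlib dbm module as an analog, since the concept is the same: a hash whose entries live on disk, so nothing gets slurped into memory. The filename is made up for illustration.

```python
import dbm

# The tied-hash idea: a dict-like object backed by a disk file.
# (Perl equivalent: tie %h, 'DB_File', "usage.db" -- same concept.)
with dbm.open("usage.db", "c") as db:   # "c": read/write, create if missing
    db["alice"] = "120"                 # entries go straight to the disk file
    db["bob"] = "45"

# Reopen read-only; lookups hit the file, nothing is loaded wholesale.
with dbm.open("usage.db", "r") as db:
    print(db[b"alice"].decode())        # -> 120
```

Lookups and updates cost one disk access each instead of holding 25 MB of parsed records resident.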

Or, if you get them into an RDBMS, you can just query the database to get
what you need.
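To show what "just query the database" buys you, here is a minimal sketch using Python's stdlib sqlite3 module. The table and column names are invented for illustration; the point is that a per-student report becomes one SQL query instead of a pass over 25 MB of raw files.

```python
import sqlite3

# Hypothetical schema: one row per computer-use record.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE usage (student TEXT, minutes INTEGER)")
con.executemany("INSERT INTO usage VALUES (?, ?)",
                [("alice", 30), ("alice", 90), ("bob", 45)])

# One student's report is a single aggregate query.
total, = con.execute(
    "SELECT SUM(minutes) FROM usage WHERE student = ?", ("alice",)
).fetchone()
print(total)   # -> 120
```

The database does the scanning and summing; the program only ever holds one result row in memory.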

If the data has to stay in those files, perhaps you can add a
preprocessing stage, where each raw data file is parsed and put into a
tied db hash [or RDBMS or XML file or something else] that you can do
more efficient processing on later.
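The preprocessing stage might look like this sketch (again in Python with dbm standing in for a tied Perl hash). The raw format is assumed here to be one tab-separated "student, minutes" record per line; the real files may differ. The key point is that the raw file is streamed line by line, so memory use stays constant no matter how large the file grows.

```python
import dbm
import io

# Stand-in for a >25 MB raw data file; assumed format: student<TAB>minutes.
raw = io.StringIO("alice\t30\nbob\t45\nalice\t90\n")

# Stream the raw file record by record and accumulate per-student totals
# in an on-disk db hash -- the whole file is never held in memory.
with dbm.open("totals.db", "n") as totals:        # "n": always start fresh
    for line in raw:
        student, minutes = line.rstrip("\n").split("\t")
        key = student.encode()
        prev = int(totals.get(key, b"0"))
        totals[key] = str(prev + int(minutes)).encode()

# Later report runs just look totals up instead of re-parsing raw files.
with dbm.open("totals.db", "r") as totals:
    print(totals[b"alice"].decode())   # -> 120
```

Run once per raw file as it arrives; the report generator then reads only the small preprocessed db, not the raw data.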

Good luck!

-- 
\js

Live as you will have wished to have lived when you are dying. 
-Christian Fürchtegott Gellert 
