Dear All,

I have an issue with a CGI script that we are testing on our ISP's shared web
server. The script has to process several hundred thousand rows of data,
which it reads from a database and uses to build an internal hash of
array references.

The script works fine up to about 100,000 records, but beyond that point it
is terminated by the ISP's server for exceeding the maximum amount of memory
that may be allocated to a single script - in this case they allow 6 MB
max.

What I was wondering is whether there is a way (perhaps using Tie::?) to have
the Perl hash stored on disk rather than built in memory. Performance
degradation would not be an issue at this stage of our development... but is
such a thing possible?
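To make the question concrete, this is roughly what I had in mind - an
untested sketch assuming MLDBM (layered over DB_File and Storable, or
SDBM_File if DB_File isn't installed) is available on the server, with a
made-up filename and sample data:

    use strict;
    use warnings;
    use Fcntl;
    # MLDBM serializes references with Storable and stores them via DB_File,
    # so values can be array refs rather than plain strings.
    use MLDBM qw(DB_File Storable);

    # Tie the hash to a disk file instead of holding it all in RAM.
    # 'records.db' is just a placeholder filename.
    my %records;
    tie %records, 'MLDBM', 'records.db', O_CREAT | O_RDWR, 0640
        or die "Cannot tie hash to records.db: $!";

    # Sample data standing in for a row fetched from the database.
    my ( $id, @row ) = ( 'row1', 1, 2, 3 );

    # Store an array reference under a key; the whole value is written to disk.
    $records{$id} = [@row];

    # Caveat: nested structures cannot be modified in place through the tie.
    # Fetch the reference, change it, then assign it back:
    my $aref = $records{$id};
    push @$aref, 4;
    $records{$id} = $aref;

    untie %records;

Would something along those lines keep the memory footprint down, or is
there a better approach?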

I'd appreciate any ideas.

Regards,

Bill Stennett

