On Sun, Sep 29, 2002 at 09:31:01PM -0400, Ranga Nathan wrote:
> Are there any special techniques for handling large hashes. When
> building an XML tree as a Perl data structure, at times more and more
> memory is being consumed eventually runs out of memory. Other than
> partitioning the problem is there any known technique?

Would this simply be a matter of using tie() on your hash? I suppose
it depends on what data structures are in your hash.
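For instance, here's a minimal sketch of tying a hash to a Berkeley DB
file with DB_File (which ships with Perl), so the key/value pairs live
on disk instead of in RAM. The filename 'tree.db' is just an example:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Fcntl;     # for O_RDWR, O_CREAT
    use DB_File;   # ties a hash to a Berkeley DB file on disk

    my %hash;
    tie %hash, 'DB_File', 'tree.db', O_RDWR|O_CREAT, 0644, $DB_HASH
        or die "Cannot tie tree.db: $!";

    $hash{some_key} = 'some value';   # written through to disk

    untie %hash;   # flush and close the database

One caveat: DB_File stores flat strings only, so if your values are
references to nested structures (as in an XML tree), you'd need to
serialize them, e.g. via MLDBM or Storable.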

When you say you have a 'large hash', do you mean that you have
lots of keys?  Or do you mean that you're shoving lots of big key,
values into it?  Or do you mean that it contains references to
really big data structures?

> ---Closing information gaps-----
> Ranga Nathan, Reliance Technology
> >>Live demo at http://any2xml.com/docs/timesheet_demo.shtml<<
> >>Get free COBOLExplorer at http://goreliance.com/download-products <<
-- 
Brian 'you Bastard' Reichert            <[EMAIL PROTECTED]>
37 Crystal Ave. #303                    Daytime number: (603) 434-6842
Derry NH 03038-1713 USA                 Intel architecture: the left-hand path
_______________________________________________
Boston-pm mailing list
[EMAIL PROTECTED]
http://mail.pm.org/mailman/listinfo/boston-pm
