Yes, that's a thought. However, we may need to write out the hash
elements to a file as XML. Thanks for the tip.
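A rough sketch of what I mean, streaming each hash element straight to disk instead of holding the whole tree in memory. The field names here are invented for illustration, and the hand-rolled escaping is minimal; a real implementation would lean on XML::Writer:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical record; with Any2XML these would come from the flat file.
my %record = ( name => 'R&D', hours => '40' );

# Minimal escaping for the characters that matter in element content;
# XML::Writer would handle this (and nesting) properly.
sub xml_escape {
    my ($s) = @_;
    $s =~ s/&/&amp;/g;
    $s =~ s/</&lt;/g;
    $s =~ s/>/&gt;/g;
    return $s;
}

open my $out, '>', 'records.xml' or die "records.xml: $!";
print {$out} qq{<?xml version="1.0"?>\n<record>\n};

# Write each element out as soon as we have it; the hash can then be
# emptied, so memory use stays flat no matter how many records follow.
for my $key ( sort keys %record ) {
    printf {$out} "  <%s>%s</%s>\n", $key, xml_escape( $record{$key} ), $key;
}

print {$out} "</record>\n";
close $out or die "close: $!";
print "wrote records.xml\n";
```

The point is that nothing accumulates: once an element is printed, it no longer needs to live in the Perl data structure.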

---Closing information gaps-----
Ranga Nathan, Reliance Technology
>>Live demo at http://any2xml.com/docs/timesheet_demo.shtml<<
>>Get free COBOLExplorer at http://goreliance.com/download-products <<

> -----Original Message-----
> From: Mark Aisenberg [mailto:[EMAIL PROTECTED]] 
> Sent: Monday, September 30, 2002 10:36 AM
> To: [EMAIL PROTECTED]
> Subject: RE: [Boston.pm] Large hashes
> 
> 
> If you can trade speed for memory, perhaps a tied hash would 
> work for you.
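For anyone following along, a minimal sketch of the tied-hash idea. I use SDBM_File here only because it ships with every Perl; DB_File (Berkeley DB) is the usual choice when the data gets really large:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl;        # O_RDWR, O_CREAT
use SDBM_File;    # in core; DB_File scales further if installed

# Tie the hash to an on-disk DBM file: elements live on disk rather
# than in memory, trading lookup speed for a bounded footprint.
my %big;
tie %big, 'SDBM_File', 'bighash', O_RDWR | O_CREAT, 0666
    or die "Cannot tie bighash: $!";

$big{"record_$_"} = "value $_" for 1 .. 1000;

my $count = scalar keys %big;
print "stored $count records\n";

untie %big;
unlink 'bighash.pag', 'bighash.dir';    # discard the demo files
```

Reads and writes go through the tie, so existing hash-based code needs no other changes.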
> 
> -----Original Message-----
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]]
> On Behalf Of Ranga Nathan
> Sent: Sunday, September 29, 2002 9:31 PM
> To: [EMAIL PROTECTED]
> Subject: [Boston.pm] Large hashes
> 
> 
> Are there any special techniques for handling large hashes?
> When building an XML tree as a Perl data structure, more and
> more memory is consumed until the process eventually runs out.
> Other than partitioning the problem, is there any known
> technique?
> 
> The problem occurs when we use Any2XML.pm, which uses Ximple.pm 
> to build the XML tree as an array of hashes recursively. 
> While this works well for most documents, it breaks down when 
> parsing flat files containing thousands of records.
> 
> ---Closing information gaps-----
> Ranga Nathan, Reliance Technology
> >>Live demo at http://any2xml.com/docs/timesheet_demo.shtml<<
> >>Get free COBOLExplorer at http://goreliance.com/download-products <<
> 

_______________________________________________
Boston-pm mailing list
[EMAIL PROTECTED]
http://mail.pm.org/mailman/listinfo/boston-pm
