Not directly answering your question, but ... if you're using Data::Dumper to 'serialise' a data structure to a file so you can slurp it back in later, you might want to check out Storable.pm, which is designed exactly for that purpose and will be faster and more memory-efficient than Data::Dumper. (A short sketch of typical Storable usage is appended at the foot of this message.)

Regards
Grant

=====================================================================
Grant McLean        | email: [EMAIL PROTECTED] | Lvl 8, 86 Lambton Quay
The Web Limited     | WWW: www.web.co.nz      | PO Box 15-175
Internet Solutions  | Tel: (04) 495 8250      | Wellington
Awesome service     | Fax: (04) 495 8259      | New Zealand

> -----Original Message-----
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]]
> Sent: Monday, November 27, 2000 11:09 PM
> To: [EMAIL PROTECTED]
> Subject: Data::Dumper and large data structures
>
>
> I've been running into memory problems when trying to dump large data
> structures. Below is a trivial example that takes up all my physical
> memory (128M).
>
> use Data::Dumper;
>
> my @array;
> for my $i (0..10000) {
>     $array[$i] = $i;
> }
>
> print Data::Dumper->Dump(
>     [\@array], [qw(*array)]
> );
>
> I'm not very up to date with perl development. Is this a known problem?
> Are there workarounds? Are there other packages that avoid this problem?

_______________________________________________
ActivePerl mailing list
[EMAIL PROTECTED]
http://listserv.ActiveState.com/mailman/listinfo/activeperl
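
[Editor's note: a minimal sketch of the Storable approach Grant suggests, using the original poster's @array; the filename 'array.sto' is just an arbitrary example, not anything from the thread.]

    use Storable qw(store retrieve);

    # Write the structure to a file in Storable's compact binary format.
    store(\@array, 'array.sto') or die "store failed: $!";

    # Later, possibly in another script, read it straight back
    # as a reference to an identical structure.
    my $aref = retrieve('array.sto');
    print "last element: $aref->[-1]\n";

Storable also offers nstore() if the file needs to be readable across machines with different byte orders, and freeze()/thaw() if you want the serialised form in a scalar rather than a file.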
