John Pettitt wrote:

I'm getting an out of memory error on large archive jobs - this is on a box with 2GB of RAM, which makes me think there is a memory leak someplace ...

Writing tar archive for host jpp-desktop-data, backup #150 to output file /dumpdir/jpp-desktop-data.150.tar.gz
Out of memory during "large" request for 528384 bytes, total sbrk() is 1023391744 bytes at /usr/local/backuppc/lib/BackupPC/FileZIO.pm line 202.
Executing: /bin/csh -cf /usr/local/backuppc/bin/BackupPC_tarCreate -t -h jpp-desktop-data -n 150 -s \* . | /usr/bin/gzip > /dumpdir/jpp-desktop-data.150.tar.gz


jeeves# perl -v
This is perl, v5.8.8 built for i386-freebsd-64int

The servers PID is 550, on host jeeves.localnet, version 3.1.0, started at 11/15 12:30.
Pool is 830.38GB comprising 3460758 files and 4369 directories (as of 11/25 14:38),


The host in question was 76GB and contained several multi-GB files.

Anybody got any ideas?
More info - it seems to be triggered by restoring lots of already-compressed files (in the two cases I've seen, one was an iTunes library and the other was a photo archive of camera-raw compressed .tif files).

I can reproduce it from the command line - the perl instance grows to 363 MB, stays there for a while, then grows to 1GB (the dlimit on this box) and fails.   Other restores with similar numbers of files run to completion and stay around 40-50 MB.
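For anyone who wants to try this, here's roughly how I reproduce and watch it - a rough sketch in plain sh, with the paths, host name and backuppc user taken from my setup, so adjust for yours:

    # Run the same archive command BackupPC executes (taken from the log above):
    /usr/local/backuppc/bin/BackupPC_tarCreate -t -h jpp-desktop-data -n 150 -s \* . \
        | /usr/bin/gzip > /dumpdir/jpp-desktop-data.150.tar.gz &

    # In another shell, watch the size of the perl process; on the bad jobs
    # it climbs past 1GB instead of holding at ~40-50MB:
    while true; do
        ps -o pid,rss,vsz,command -U backuppc | grep BackupPC_tarCreate | grep -v grep
        sleep 10
    done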


Well, some more digging and a recompile of perl set to use the system malloc() instead of its built-in malloc() (usemymalloc=n), and the problem has gone away.    Something the restore process does really causes perl's malloc to go crazy on large restore jobs.   Playing with the write buffer size changed how the problem manifested (both smaller - 65536 - and larger - 2^24, i.e. 16MB - buffers made it better but didn't eliminate it).    The 1MB default buffer seemed to be the worst.
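If anyone wants to check their own perl, here's a quick sketch. The Configure line is the generic build-from-source route; the FreeBSD port has its own way of setting this, so check the port's Makefile rather than taking my flags as gospel:

    # Check whether perl was built with its own malloc:
    # 'y' means the bundled perl malloc is in use, 'n' means the system malloc.
    perl -V:usemymalloc

    # Generic rebuild from the perl source tree with the system malloc
    # (which is what usemymalloc=n amounts to); -U undefines the symbol.
    sh Configure -des -Uusemymalloc
    make && make test && make install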

There is a comment in the FreeBSD perl ports Makefile about malloc having problems with threads, but perl was not built with threads on my box as far as I know, and BackupPC does not use threads.

I just managed to get an archive of a 375GB backup set so I'm now much happier.

John