John Pettitt wrote:

I'm getting an out-of-memory error on large archive jobs. This is on a box with 2GB of RAM, which makes me think there is a memory leak someplace ...

Writing tar archive for host jpp-desktop-data, backup #150 to output file /dumpdir/jpp-desktop-data.150.tar.gz
Out of memory during "large" request for 528384 bytes, total sbrk() is 1023391744 bytes at /usr/local/backuppc/lib/BackupPC/FileZIO.pm line 202.
Executing: /bin/csh -cf /usr/local/backuppc/bin/BackupPC_tarCreate -t -h jpp-desktop-data -n 150 -s \* . | /usr/bin/gzip > /dumpdir/jpp-desktop-data.150.tar.gz
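
(For what it's worth, the "total sbrk() is 1023391744 bytes" in that message is just shy of 1 GB, which looks more like a per-process data segment limit being hit than the 2 GB of physical RAM. Under csh the limit can be checked and, as a test, raised for the current shell - just a sketch, the 2 GB value below is only an example:

    # show the current per-process data segment limit (csh builtin, value in kbytes)
    limit datasize
    # raise it for this shell before re-running the archive, e.g. to 2 GB
    limit datasize 2097152
)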


jeeves# perl -v
This is perl, v5.8.8 built for i386-freebsd-64int

The servers PID is 550, on host jeeves.localnet, version 3.1.0, started at 11/15 12:30.
Pool is 830.38GB comprising 3460758 files and 4369 directories (as of 11/25 14:38),


The host in question was 76GB and contained several multi-GB files.

Anybody got any ideas?
More info: it seems to be triggered by restoring lots of already-compressed files (in the two cases I've seen, one was an iTunes library and the other was a photo archive of compressed camera-raw .tif files).

I can reproduce it from the command line: the perl instance grows to 363 MB, stays there for a while, then grows to 1GB (the per-process datasize limit on this box) and fails. Other restores with similar numbers of files run to completion and stay around 40-50 MB.
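
For reference, this is roughly how I'm reproducing and watching it by hand (same host and backup number as above; output goes to /dev/null so gzip isn't a factor, and the ps loop is plain sh):

    # terminal 1: run the same tarCreate by hand, discarding the output
    /usr/local/backuppc/bin/BackupPC_tarCreate -t -h jpp-desktop-data -n 150 -s \* . > /dev/null

    # terminal 2: poll the perl process size (RSS/VSZ in kbytes) every 10 seconds
    while true; do ps -ax -o pid,rss,vsz,command | grep '[B]ackupPC_tarCreate'; sleep 10; done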

John
