Hi,

I just tried backing up some directories with a lot of files in them. The
full backup went OK, but the incremental backup choked my machine (it's a
test PC with only 512 MB of RAM): the BackupPC_dump process was using up all
available memory (500 MB).
So my question is:
if I back up a mail storage server with the following stats for a full
backup: 1403556 files, 31818.0 MB,
what is the expected memory usage for the next incremental backup if 6000
files change, using rsync as the transfer protocol?
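
For what it's worth, a back-of-envelope estimate: as far as I understand,
the rsync transfer method holds the entire file list in memory, at a cost
of a few hundred bytes per file (that per-entry figure is my guess, not a
measured value). A quick sketch of the arithmetic:

# Back-of-envelope estimate (assumptions: the rsync method keeps the
# whole file list in memory, and each entry costs ~350 bytes -- that
# per-entry figure is a guess, not a measured value).
num_files = 1403556            # files in the full backup above
bytes_per_entry = 350          # assumed per-file bookkeeping overhead

estimated_mb = num_files * bytes_per_entry / (1024.0 * 1024.0)
print("estimated file-list memory: %.0f MB" % estimated_mb)
# ~469 MB -- in the same ballpark as the 500 MB I saw, and independent
# of how few files (e.g. the 6000 changed ones) the incremental
# actually transfers.

If that assumption holds, memory scales with the total file count, not
with the number of changed files.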

The thing is, I expect a backup server to need great disk I/O and network
bandwidth, but I don't expect it to need gigabytes of memory. Of course
memory is cheap nowadays (and the real backup server will have at least
2 GB), but that's beside the point here ...
I think, for debugging purposes, it would be great to be able to send a USR
signal of some kind and get memory usage info per variable, so one can see
which variables are taking up all the memory. Or maybe to be able to
configure a limit on the memory used by the process (so it might take a bit
longer, but use less memory).
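
To illustrate what I mean (BackupPC itself is Perl, so this is only a
sketch of the mechanism, in Python, and the 256 MB cap below is an
arbitrary example value, not a recommendation):

# Sketch of both ideas -- not a patch against BackupPC_dump, just an
# illustration of the mechanism.
import resource
import signal
import sys

def dump_memory_usage(signum, frame):
    # On SIGUSR1, report the (shallow) size of each module-level
    # variable, largest first, so the culprit stands out.
    sizes = sorted(((name, sys.getsizeof(value))
                    for name, value in globals().items()),
                   key=lambda item: item[1], reverse=True)
    for name, size in sizes:
        print("%s: %d bytes" % (name, size), file=sys.stderr)

signal.signal(signal.SIGUSR1, dump_memory_usage)

# Second idea: cap the address space so the process has to fail (or be
# made to work in smaller chunks) instead of eating all the RAM.
limit = 256 * 1024 * 1024  # 256 MB, arbitrary example value
resource.setrlimit(resource.RLIMIT_AS, (limit, limit))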

Franky