Hi, thanks for your answer. With respect to your questions:
1. can you post the lines following 'Maximum free size required is ...' from 
the last incremental and the last full backup that ran.
I'm not sure where I should look for this... sorry. I've looked in /var/log and 
in the log in the backup folder, but haven't found the lines. Any help finding 
it would be appreciated.
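For what it's worth, this is roughly the kind of search I tried, with no matches so far (the backup path below is just a placeholder for my own setup):

```shell
# Search likely locations for the size line and print the two lines
# that follow each match (-A 2); -r recurses, -n shows line numbers.
grep -rn -A 2 "Maximum free size required" /var/log 2>/dev/null
grep -rn -A 2 "Maximum free size required" /path/to/backup/dir 2>/dev/null
```

If the message is written somewhere else entirely, please point me at the right file.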
2. how long did the backup regularly take.
Generally the full backup completes in about 20 minutes, I'd say, and the 
incremental in less than 5.
3. can you post the approx. filesize of the 'files.snar' located in the backup 
dirs.
files.snar in a full backup weighs 1.9 MB.
4. what are your machine specifications (cpu, ram).
CPU details are posted here (from sudo lshw):
description: CPU
          product: AMD Athlon(tm) XP 2600+
          vendor: Advanced Micro Devices [AMD]
          physical id: 5
          bus info: c...@0
          version: 6.10.0
          slot: Socket A
          size: 1916MHz
          capacity: 3GHz
          width: 32 bits
          clock: 166MHz
and RAM details here:
*-memory
          description: System Memory
          physical id: 1d
          slot: System board or motherboard
          size: 1280MiB
          capacity: 2GiB
        *-bank:0
             description: DIMM
             product: None
             vendor: None
             physical id: 0
             serial: None
             slot: A0
             size: 1GiB
             width: 64 bits
        *-bank:1
             description: DIMM
             product: None
             vendor: None
             physical id: 1
             serial: None
             slot: A1
             size: 256MiB
             width: 64 bits
I'm backing up to a custom directory on a second internal hard disk 
(something sbackup didn't easily allow), in gzip format, including /home and 
excluding files larger than 100 MB. I've scheduled a daily incremental backup 
with a simple cut-off at >16 days and purging activated. I can confirm that 
Python doesn't run indefinitely but does, eventually, stop. Ideally, I'd like 
to run a full backup, then a series of incrementals, then another full 
backup, discarding everything older than the most recent full backup.
A full backup directory, in my case, only weighs about 8.3 GB, by the way.
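If there's no built-in way to do that, I imagine something along these lines could prune old snapshots by hand. This is only a sketch, assuming the snapshot directories end in ".ful"/".inc" and sort chronologically by name, and BACKUP_DIR is a placeholder:

```shell
# Keep the newest full backup and everything after it; list older snapshots.
# Assumption: snapshot dir names start with a timestamp, so lexical order
# matches chronological order.
BACKUP_DIR=/path/to/backups
latest_full=$(ls -d "$BACKUP_DIR"/*.ful 2>/dev/null | sort | tail -n1)
[ -n "$latest_full" ] || exit 0   # no full backup yet: touch nothing
for snap in "$BACKUP_DIR"/*.ful "$BACKUP_DIR"/*.inc; do
  [ -d "$snap" ] || continue
  if [ "$snap" \< "$latest_full" ]; then
    echo "Would remove: $snap"    # replace echo with rm -r once verified
  fi
done
```

I'd obviously run it with echo first before letting it delete anything.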
Thanks again for your time and help. Best, G.R.

-- 
Merge of incremental snapshot metadata is much too slow (Python process runs 
100% CPU)
https://bugs.launchpad.net/bugs/585358
You received this bug notification because you are a member of NSsbackup
team, which is subscribed to NSsbackup.
