Re: [BackupPC-users] BackupPC failing to back up a host with big files

2012-08-28 Thread Les Mikesell
On Tue, Aug 28, 2012 at 12:09 AM, martin f krafft madd...@madduck.net wrote:
 also sprach Les Mikesell lesmikes...@gmail.com [2012.08.27.2326 +0200]:
 The only setting I can see that relates to this would be
 PartialAgeMax.  Are the retries happening before this expires?
 Otherwise you'd have to poke through the code to see why it prefers the
 previous full.

 This is a good idea, especially since I found in another log:

   full backup started for directory /; updating partial #91

 I will investigate this and get back to you.

 Why would I not want to build upon a partial backup older than
 3 days?


Not sure about the reasoning there, but I think that incomplete
backups are only kept if they contain more files than the previous
partial.  If you haven't succeeded in getting a better partial in some
amount of time, it might be better to throw the old one away and start
from scratch.  In the case of a fast network and a very large number
of files, the comparison can take longer than a transfer from
scratch.

The ClientTimeout setting may be your real issue, though.  A running
backup should never time out as long as anything is transferring, even
if it takes days to complete, but with rsync that value can apply to the
whole backup.  I'd try raising it a lot and trying to catch up over
a weekend.
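As a minimal sketch of that suggestion, in a per-host BackupPC config file (the path and value here are only examples; adjust to your install and worst-case run time):

```perl
# Raise the transfer timeout so a multi-day catch-up full is not killed.
# 259200 s = 72 h; pick a value comfortably above your slowest full.
$Conf{ClientTimeout} = 259200;
```

$Conf{ClientTimeout} is a real BackupPC setting; once the host has caught up, the value can be lowered again.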

-- 
   Les Mikesell
 lesmikes...@gmail.com

--
Live Security Virtual Conference
Exclusive live event will cover all the ways today's security and
threat landscape has changed and how IT managers can respond. Discussions
will include endpoint security, mobile security and the latest in malware
threats. http://www.accelacomm.com/jaw/sfrnl04242012/114/50122263/
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] BackupPC failing to back up a host with big files

2012-08-28 Thread martin f krafft
also sprach Les Mikesell lesmikes...@gmail.com [2012.08.28.1455 +0200]:
 The ClientTimeout setting may be your real issue though.  A running
 backup should never time out as long as anything is transferring, even
 if it takes days to complete but with rsync that value can be for the
 whole backup.  I'd try raising it a lot and trying to catch up over
 a weekend.

I have gone down this path too. I really don't like it because it
is, as you say, a safety net and I don't want to loosen it.

I don't quite understand your first comment though. The docs say
this for ClientTimeout:

  Timeout in seconds when listening for the transport program's
  (smbclient, tar etc) stdout. If no output is received during
  this time, then it is assumed that something has wedged during
  a backup, and the backup is terminated.

  Note that stdout buffering combined with huge files being
  backed up could cause longish delays in the output from
  smbclient that BackupPC_dump sees, so in rare cases you might
  want to increase this value.

  Despite the name, this parameter sets the timeout for all
  transport methods (tar, smb etc).

Especially the last sentence suggests that this also applies to
rsync, so I do not understand why an rsync backup would time out
unless there was no more data for almost a day (72,000 seconds).

-- 
martin | http://madduck.net/ | http://two.sentenc.es/
 
the word yellow wandered through his mind in search of something to
 connect with.
 -- hitchhiker's guide to the galaxy
 
spamtraps: madduck.bo...@madduck.net


digital_signature_gpg.asc
Description: Digital signature (see http://martin-krafft.net/gpg/sig-policy/999bbcc4/current)


Re: [BackupPC-users] BackupPC failing to back up a host with big files

2012-08-28 Thread Les Mikesell
On Tue, Aug 28, 2012 at 8:29 AM, martin f krafft madd...@madduck.net wrote:
 The ClientTimeout setting may be your real issue though.  A running
 backup should never time out as long as anything is transferring, even
 if it takes days to complete but with rsync that value can be for the
 whole backup.  I'd try raising it a lot and trying to catch up over
 a weekend.

 I have gone down this path too. I really don't like it because it
 is, as you say, a safety net and I don't want to loosen it.

 I don't quite understand your first comment though. The docs say
 this for ClientTimeout:

   Timeout in seconds when listening for the transport program's
   (smbclient, tar etc) stdout. If no output is received during
   this time, then it is assumed that something has wedged during
   a backup, and the backup is terminated.

   Note that stdout buffering combined with huge files being
   backed up could cause longish delays in the output from
   smbclient that BackupPC_dump sees, so in rare cases you might
   want to increase this value.

   Despite the name, this parameter sets the timeout for all
   transport methods (tar, smb etc).

 Especially the last sentence suggests that this also applies to
 rsync, so I do not understand why an rsync backup would time out
 unless there was no more data for almost a day (72,000 seconds).

Maybe it is a bug that has been fixed in the current version, but I
know I have seen situations where the backup ended with an Alarm
signal even though files in the backup had been updated not long before
the timeout.  I assumed it was an obscure bug in the rsync code that
fails to advance the timer on activity.  If that is happening in your
case, raising the timeout at least temporarily might get past your
problem.  I think the default used to be 7200 seconds, so it was much
more common to see the issue.

-- 
   Les Mikesell
 lesmikes...@gmail.com



Re: [BackupPC-users] BackupPC failing to back up a host with big files

2012-08-28 Thread martin f krafft
also sprach Les Mikesell lesmikes...@gmail.com [2012.08.28.1650 +0200]:
 Maybe it is a bug that has been fixed in the current version but I
 know I have seen situations where the backup ended with an Alarm
 signal but files in the backup had been updating not long before the
 timeout.  I assumed it was an obscure bug in the rsync code to not
 advance the timer on any activity.

This sounds like what's happening here (3.1.0). You wouldn't happen
to have a pointer to the patch, would you? Or is it fixed in 3.2.1
(next Debian stable)? Then I would consider upgrading ahead of time…

-- 
martin | http://madduck.net/ | http://two.sentenc.es/
 
writing a book is like washing an elephant: there's no good place to
 begin or end, and it's hard to keep track of what you've already
 covered.
-- anonymous
 
spamtraps: madduck.bo...@madduck.net




Re: [BackupPC-users] BackupPC failing to back up a host with big files

2012-08-28 Thread Les Mikesell
On Tue, Aug 28, 2012 at 11:35 AM, martin f krafft madd...@madduck.net wrote:
 also sprach Les Mikesell lesmikes...@gmail.com [2012.08.28.1650 +0200]:
 Maybe it is a bug that has been fixed in the current version but I
 know I have seen situations where the backup ended with an Alarm
 signal but files in the backup had been updating not long before the
 timeout.  I assumed it was an obscure bug in the rsync code to not
 advance the timer on any activity.

 This sounds like what's happening here (3.1.0). You wouldn't happen
 to have a pointer to the patch, would you? Or is it fixed in 3.2.1
 (next Debian stable)? Then I would consider upgrading ahead of time…

No, I don't actually think it is fixed, but I'm not sure.  The
default was bumped from 2 to 20 hours at some point, and I've set some
of mine even higher.

-- 
   Les Mikesell
 lesmikesell#gmail.com



Re: [BackupPC-users] BackupPC failing to back up a host with big files

2012-08-27 Thread Les Mikesell
On Mon, Aug 27, 2012 at 4:19 AM, martin f krafft madd...@madduck.net wrote:


 However, the status quo seems broken to me. If BackupPC times out on
 a backup and stores a partial backup, it should be able to resume
 the next day. But this is not what seems to happen. The log seems to
 suggest that each backup run uses the previous full backup (#506,
 not the partial backup #507) as baseline:

   full backup started for directory / (baseline backup #506)

Rsync should resume a partial, and probably would if you did not have
any other fulls.  I'm not sure how it decides which to use as the base
in that case.

 I only just turned on verbose logging to find out what's actually
 going on and whether there is any progress being made, but to me it
 seems like BackupPC is failing to build upon the work done for the
 last partial backup and keeps trying again and again.

 Has anyone seen this behaviour and do you have any suggestions how
 to mitigate this problem?

If you are running over ssh, you can try adding the -C option for
compression if you haven't already.  You could exclude some of the
new large files so a new run would complete, then include some, do
another full, and repeat until you have the whole set.  Or use brute
force: take the server to the client LAN, or bring a complete clone of
the client's filesystem to the server LAN and temporarily change
ClientNameAlias to point to it while you do a full backup to get the
base copy.

Or, you might try adding it under a different hostname with
ClientNameAlias pointed at the original host to see if it does reuse
the partials when there is no other choice.
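A hedged sketch of the ClientNameAlias trick, in the host's config file; the clone's hostname here is hypothetical:

```perl
# Temporarily point this host's backups at a local clone of its
# filesystem; remove the alias again after the first full completes.
$Conf{ClientNameAlias} = 'clone.example.com';
```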


-- 
   Les Mikesell
 lesmikes...@gmail.com



Re: [BackupPC-users] BackupPC failing to back up a host with big files

2012-08-27 Thread Mike
On 12-08-27 09:57 AM, Les Mikesell wrote:
 On Mon, Aug 27, 2012 at 4:19 AM, martin f krafft madd...@madduck.net wrote:

 However, the status quo seems broken to me. If BackupPC times out on
 a backup and stores a partial backup, it should be able to resume
 the next day. But this is not what seems to happen. The log seems to
 suggest that each backup run uses the previous full backup (#506,
 not the partial backup #507) as baseline:

full backup started for directory / (baseline backup #506)
 Rsync should resume a partial, and probably would if you did not have
 any other fulls.  I'm not sure how it decides which to use as the base
 in that case.


I've found it doesn't always do this.

The fix I've used is to --exclude a good chunk of the backup so it does 
finish in a day, then --exclude a little less the next time it runs, and 
so forth.

Unfortunately it's a slow, manual process, but it works!
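The staged-exclude approach can be expressed in the host's BackupPC config; the paths below are hypothetical examples:

```perl
# First pass: exclude the big trees so the full finishes within a day.
$Conf{BackupFilesExclude} = {
    '/' => [ '/home/*/video', '/home/*/images' ],
};
# On later passes, shrink this list until nothing is excluded.
```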

-- 
Looking for (employment|contract) work in the
Internet industry, preferably working remotely.
Building / Supporting the net since 2400 baud was
the hot thing. Ask for a resume! ispbuil...@gmail.com




Re: [BackupPC-users] BackupPC failing to back up a host with big files

2012-08-27 Thread martin f krafft
also sprach Mike ispbuil...@gmail.com [2012.08.27.1502 +0200]:
 The fix I've used is to --exclude a good chunk of the backup so it does 
 finish in a day, then --exclude a little less the next time it runs, and 
 so forth.

I have used this method, which you and Les suggested. And yes, it
works. However, it's hardly a fix or even an acceptable solution
in a large-scale deployment. There is no way I can tell this to
hundreds of users who do occasionally dump larger files to their
home directories.

BackupPC must be able to cope with this, or it must learn.
Otherwise, I am afraid, it is unsuitable.

Do you see a way? How can I force it to use the last partial backup
as a baseline when attempting a new full backup?

-- 
martin | http://madduck.net/ | http://two.sentenc.es/
 
without music, life would be a mistake.
 - friedrich nietzsche
 
spamtraps: madduck.bo...@madduck.net




Re: [BackupPC-users] BackupPC failing to back up a host with big files

2012-08-27 Thread Bryan Keadle (.net)
Thinking out loud here: could you change the transfer method to SMB to get a
new full copy, then switch back to rsync?


On Mon, Aug 27, 2012 at 3:45 PM, martin f krafft madd...@madduck.netwrote:

 also sprach Mike ispbuil...@gmail.com [2012.08.27.1502 +0200]:
  The fix I've used is to --exclude a good chunk of the backup so it does
  finish in a day, then --exclude a little less the next time it runs, and
  so forth.

 I have used this method, which you and Les suggested. And yes, it
 works. However, it's hardly a fix or even an acceptable solution
 in a large-scale deployment. There is no way I can tell this to
 hundreds of users who do occasionally dump larger files to their
 home directories.

 BackupPC must be able to cope with this, or it must learn.
 Otherwise, I am afraid, it is unsuitable.

 Do you see a way? How can I force it to use the last partial backup
 as a baseline when attempting a new full backup?

 --
 martin | http://madduck.net/ | http://two.sentenc.es/

 without music, life would be a mistake.
  - friedrich nietzsche

 spamtraps: madduck.bo...@madduck.net






Re: [BackupPC-users] BackupPC failing to back up a host with big files

2012-08-27 Thread Les Mikesell
On Mon, Aug 27, 2012 at 3:45 PM, martin f krafft madd...@madduck.net wrote:
 also sprach Mike ispbuil...@gmail.com [2012.08.27.1502 +0200]:
 The fix I've used is to --exclude a good chunk of the backup so it does
 finish in a day, then --exclude a little less the next time it runs, and
 so forth.

 I have used this method, which you and Les suggested. And yes, it
 works. However, it's hardly a fix or even an acceptable solution
 in a large-scale deployment. There is no way I can tell this to
 hundreds of users who do occasionally dump larger files to their
 home directories.

The best solution would be to have enough bandwidth to meet your
requirements...   Or at least enough to catch up by running through
weekends.

 BackupPC must be able to cope with this, or it must learn.
 Otherwise, I am afraid, it is unsuitable.

 Do you see a way? How can I force it to use the last partial backup
 as a baseline when attempting a new full backup?

The only setting I can see that relates to this would be
PartialAgeMax.  Are the retries happening before this expires?
Otherwise you'd have to poke through the code to see why it prefers the
previous full.

-- 
   Les Mikesell
 lesmikes...@gmail.com



Re: [BackupPC-users] BackupPC failing to back up a host with big files

2012-08-27 Thread Les Mikesell
On Mon, Aug 27, 2012 at 4:26 PM, Les Mikesell lesmikes...@gmail.com wrote:
 
 The best solution would be to have enough bandwidth to meet your
 requirements...   Or at least enough to catch up by running through
 weekends.

Forgot to mention that you might have to bump ClientTimeout way up.
In theory this is supposed to be an idle timer and only quit when
there is no activity, but there are circumstances where it is taken as
the time for the whole backup run.

-- 
   Les Mikesell
lesmikes...@gmail.com



Re: [BackupPC-users] BackupPC failing to back up a host with big files

2012-08-27 Thread Adam Goryachev
On 08/27/2012 07:19 PM, martin f krafft wrote:
 Dear list,

 I am very happy with BackupPC, except that it hasn't been able to
 back up my workstation for more than 3 months. It started a full
 backup on 14 May 2012, shortly after I had stored a set of large
 files to disk. Since my workstation is behind an ADSL connection,
 the backup eventually timed out and was stored as a partial backup
 (#507).

 Every day since then, the full backup started again, and every day
 since then, it has been timing out.

 This is especially annoying since the exact same files have already
 been backed up to the server once from another host, so one would
 like to assume that BackupPC does not need to transfer them all over
 again.

 I understand that the rsync protocol does not allow this and
 a BackupPC client would be needed to shortcut uploading of files
 that already exist on the server.

 However, the status quo seems broken to me. If BackupPC times out on
 a backup and stores a partial backup, it should be able to resume
 the next day. But this is not what seems to happen. The log seems to
 suggest that each backup run uses the previous full backup (#506,
 not the partial backup #507) as baseline:

full backup started for directory / (baseline backup #506)

 I only just turned on verbose logging to find out what's actually
 going on and whether there is any progress being made, but to me it
 seems like BackupPC is failing to build upon the work done for the
 last partial backup and keeps trying again and again.

 Has anyone seen this behaviour and do you have any suggestions how
 to mitigate this problem?

Yes, and I've found the best way to back up large files over slow links
is to split them into smaller files using split. However, for a random
user who dumps a DVD image or HDD image to their computer, this is never
going to work. It only works if you can control the system and how the
user behaves.

Solutions:
1) Add a maximum-size option to rsync (e.g. --max-size) and inform users
that files larger than X will not be backed up (probably not a good
solution if that large file just happened to be very important).
2) Split large files prior to backup; again, not so useful if you can't
control the user or this isn't a standard backup file, i.e. random users
doing random things.
3) Don't use BackupPC to get a copy of the files across the slow link.
Use standard rsync with options like --partial to make a copy of the files
local to the backup server, then use BackupPC to back up this local store.

The second option is what I frequently use when a remote server has a
large file (e.g. an MS SQL dump or a disk image), so I just split the
large file before BackupPC does the backup and exclude the original file.
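The split workaround, sketched with hypothetical file names: chunk the dump before BackupPC runs, exclude the original, and reassemble with cat on restore:

```shell
#!/bin/sh
set -e
DIR=/tmp/split-demo
mkdir -p "$DIR"; cd "$DIR"
# Stand-in for a large SQL dump or disk image.
dd if=/dev/zero of=big.dump bs=1024 count=300 2>/dev/null
# Chunk it so each piece transfers well within ClientTimeout; BackupPC
# then only re-sends pieces that changed, not the whole file.
split -b 100k -d big.dump big.dump.part.
# Restore side: reassemble and verify the pieces match the original.
cat big.dump.part.* > restored.dump
cmp big.dump restored.dump && echo OK
```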

I am seriously considering looking more into option 3; however, the
downside is that BackupPC will no longer be able to inform me of
a lack of recent backups, since it will always succeed in backing up the
local copy. I would need a fairly robust system to ensure that
regular copies of the remote system are successfully being made to the
local system.

PS: the real problem with what you are seeing is possibly that BackupPC
isn't backing up one of the large files before being interrupted or timing
out. Since it doesn't continue the backup from the middle of the file,
you never complete that file, and therefore will never complete the
backup. It would be ideal if BackupPC were able to keep a partial file in
a partial backup, so that it could continue the backup on the next run.
Even more intelligent would be the ability to continue the backup using
both the partial file AND the previous complete copy (if one existed), so
that when a large file (disk image) was regularly backed up and
interrupted halfway through (ADSL dropped sync), it would continue
instead of doing a full copy of the second half of the file, by using the
second half of the original file.

Regards,
Adam




Re: [BackupPC-users] BackupPC failing to back up a host with big files

2012-08-27 Thread backuppc
Adam Goryachev wrote at about 09:23:51 +1000 on Tuesday, August 28, 2012:
  PS, the real problem with what you are seeing is possibly that backuppc 
  isn't backing up one of the large files before being interrupted/timing 
  out. Since it doesn't continue the backup from the middle of the file, 
  you never complete that file, and therefore will never complete the 
  backup. It would be ideal if backuppc was able to keep a partial file in 
  a partial backup, so that it could continue the backup on the next run. 
  Even more intelligent would be the ability to continue the backup using 
  both the partial file AND the previous complete copy (if one existed) so 
  that when a large file (disk image) was regularly backed up, and 
  interrupted half way through (ADSL dropped sync) then it would continue 
  instead of doing a full copy of the second half of the file by using the 
  second half of the original file.

Given that the fate and timing of a new 4.x version of BackupPC is
uncertain at best, and that this is a relatively common problem, maybe
we should think about how to *patch* BackupPC 3.x so that partially
backed-up files are stored and can be resumed from. This should never
affect more than one file per backup -- i.e., the file currently being
rsynced at the time that BackupPC fails or times out.

Upon the next backup, rsync's built-in ability to resume partial
copies could be used to continue the copy mid-stream. This may require
some changes to File::RsyncP if the resume functionality is not built
into the Perl implementation, but hopefully that wouldn't be too
difficult. The changes to BackupPC don't seem like they would be too
difficult either, since it is already able to recycle partial backups
at the file level; all that would be needed is to check for and resume
any partially backed-up file.

Anybody interested in pursuing this?



Re: [BackupPC-users] BackupPC failing to back up a host with big files

2012-08-27 Thread martin f krafft
also sprach backu...@kosowsky.org [2012.08.28.0208 +0200]:
 Given that the fate and timing of a new 4.x version of BackupPC is
 uncertain at best and that this is a relatively common problem, maybe
 we should think about how to *patch* BackupPC 3.x so that partially
 backup files are stored and can be resumed from.
[…]
 Anybody interested in pursuing this?

I could make a case to fund such a pursuit, even if only partially.

-- 
martin | http://madduck.net/ | http://two.sentenc.es/
 
minchinhampton (n.): the expression on a man's face when he has just
zipped up his trousers without due care and attention.
   -- douglas adams, the meaning of liff
 
spamtraps: madduck.bo...@madduck.net




Re: [BackupPC-users] BackupPC failing to back up a host with big files

2012-08-27 Thread martin f krafft
also sprach Les Mikesell lesmikes...@gmail.com [2012.08.27.2326 +0200]:
 The only setting I can see that relates to this would be
 PartialAgeMax.  Are the retries happening before this expires?
 Otherwise you'd have to poke through the code to see why it prefers the
 previous full.

This is a good idea, especially since I found in another log:

  full backup started for directory /; updating partial #91

I will investigate this and get back to you.

Why would I not want to build upon a partial backup older than
3 days?
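For reference, the setting Les pointed at, as it might look in config.pl; the value is an example only (the shipped default keeps partials for 3 days):

```perl
# Allow a partial backup to be used as the base for up to 14 days
# before it is discarded and the transfer starts from scratch.
$Conf{PartialAgeMax} = 14;
```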

-- 
martin | http://madduck.net/ | http://two.sentenc.es/
 
vulgarity is simply the conduct of other people.
-- oscar wilde
 
spamtraps: madduck.bo...@madduck.net

