I add an (empty) file named '-i' in several of my key Linux
directories to prevent inadvertent rm * calamities.
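The trick relies on ordinary glob expansion; a minimal demonstration (the directory and file names are examples):

```shell
# Why an empty file named "-i" guards against "rm *": the shell expands
# the glob before rm runs, and "-i" sorts near the front of the expansion,
# so the command becomes "rm -i file1 file2 ..." and rm prompts first.
cd "$(mktemp -d)"            # throwaway directory for the demonstration
touch -- -i precious-file    # "--" stops touch from parsing -i as an option
ls -A                        # both the guard file and the real file exist
```

Note this only protects against a literal `rm *` run inside that directory; an `rm -rf` of the parent is unaffected.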
However, BackupPC doesn't seem to like this, giving me error messages
of the form:
Can't open /var/lib/BackupPC//pc/mycomputer/new/f%2fetc/f-i for empty output\n
Then, when I look in
Because of limitations of my NAS, I can only run user-space NFS (unfs)
on it.
When BackupPC has TopDir set to the mounted (u)NFS share, it
ends up silently freezing at the start of a backup.
I traced the problem to issues with symbolic links. Specifically, with
the combination of the flags
I am using the rsync-method to back up the C-drive on a WinXP machine.
I have cygwin ssh rsync installed on the remote machine.
The backup repeatedly stalls after backing up:
.file_store_32/runescape/main_file_cache.dat1
and in the middle (presumably) of backing up:
When running an incremental backup on my Linux system, BackupPC
repeatedly hangs when backing up the mythconverg mysql database for
mythtv.
Note that the database seems to be perfectly intact and I don't have
any problem 'rsyncing' it manually.
The specific error messages I get are:
Can't get
Nils Breunese (Lemonbit) wrote at about 19:50:35 +0200 on Sunday, October 19,
2008:
Jeffrey J. Kosowsky wrote:
When running an incremental backup on my Linux system, BackupPC
repeatedly hangs when backing up the mythconverg mysql database for
mythtv.
Note that the database
Holger Parplies wrote at about 22:49:28 +0200 on Sunday, October 19, 2008:
Hi,
Jeffrey J. Kosowsky wrote on 2008-10-19 14:58:15 -0400 [Re: [BackupPC-users]
Incremental dumps hanging with 'Can't get rsync digests' 'Can't call
method isCached']:
Nils Breunese (Lemonbit) wrote
Linux Punk wrote at about 16:53:09 -0600 on Wednesday, October 22, 2008:
On Sun, Oct 19, 2008 at 8:49 AM, Jeffrey J. Kosowsky
[EMAIL PROTECTED] wrote:
I am using the rsync-method to back up the C-drive on a WinXP machine.
I have cygwin ssh rsync installed on the remote machine
I am backing up to an nfs share -- specifically, I have set
/var/lib/BackupPC to be a link to an nfs share.
I would like to check to see that the share is mounted before any
action since the machine hosting the nfs share sometimes goes down or
is disconnected from the net.
I tried using
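One way to sketch such a check without touching the (possibly stale) share is to read /proc/mounts directly; the path and the idea of wiring this into a pre-dump hook are assumptions about the setup:

```shell
# Probe /proc/mounts instead of stat()ing the share itself: a stale NFS
# mount can hang on any access, but reading the mount table is local.
is_mounted() {
    awk -v dir="$1" '$2 == dir { found = 1 } END { exit !found }' /proc/mounts
}

if is_mounted /var/lib/BackupPC; then
    echo "share is mounted"
else
    echo "share is NOT mounted"
fi
```

With $Conf{UserCmdCheckStatus} enabled, a nonzero exit from a $Conf{DumpPreUserCmd} script along these lines should make BackupPC skip the backup rather than write into the empty mount point.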
I have googled and read a lot of posts about people having trouble
with cygwin rsync/ssh but haven't seen any definitive solutions.
For me, BackupPC repeatedly hangs after backing up the first dozen or
so files on any of my Windows machines. On any given machine, the hang
always occurs on the
Tomasz Chmielewski wrote at about 11:42:22 +0100 on Sunday, October 26, 2008:
Jeffrey J. Kosowsky schrieb:
I have googled and read a lot of posts about people having trouble
with cygwin rsync/ssh but haven't seen any definitive solutions.
For me, BackupPC repeatedly hangs after
Jeffrey J. Kosowsky wrote at about 17:04:03 -0400 on Sunday, October 19, 2008:
Holger Parplies wrote at about 22:49:28 +0200 on Sunday, October 19, 2008:
Hi,
Jeffrey J. Kosowsky wrote on 2008-10-19 14:58:15 -0400 [Re: [BackupPC-users]
Incremental dumps hanging with 'Can't get rsync
dan wrote at about 21:39:46 -0600 on Sunday, October 26, 2008:
why don't you just have the nfs mounted all the time?
Well, I would prefer that, but my network isn't 100% reliable, so nfs
and/or the various machines sometimes go down. This tends to leave nfs
with a stale mount -- so it looks like it's
dan wrote at about 21:45:25 -0600 on Sunday, October 26, 2008:
I have posted this a few times but here it is again.
Instead of installing cygwin completely, install Deltacopy instead.
1) It is rsync on windows via cygwin
2) it has a nice GUI
3) directories available via rsync are
Holger Parplies wrote at about 03:08:47 +0100 on Monday, October 27, 2008:
Hi,
I admit to not believing in backupcentral.com for various reasons, but I
would have expected attachments to be available even there (they
apparently aren't - at least I can't find it either). The original
Linux Punk wrote at about 16:53:08 -0600 on Sunday, October 26, 2008:
On Sun, Oct 26, 2008 at 4:49 AM, Steen Eugen Poulsen [EMAIL PROTECTED]
wrote:
Jeffrey J. Kosowsky wrote:
I have googled and read a lot of posts about people having trouble
with cygwin rsync/ssh but haven't seen
Craig Barratt wrote at about 00:46:47 -0700 on Monday, October 27, 2008:
Jeffrey writes:
Can't call method isCached on an undefined value at
/usr/share/BackupPC/lib/BackupPC/Xfer/RsyncFileIO.pm line 165.
That isn't good. This is the case where it is doing a random check
of the
Holger Parplies wrote at about 13:57:35 +0100 on Monday, October 27, 2008:
Jeffrey J. Kosowsky wrote on 2008-10-27 01:18:24 -0400 [Re: [BackupPC-users]
Incremental dumps hanging with 'Can't get rsync digests' 'Can't call
method isCached']:
sorry about that. I experimented on tar backups
Rob Owens wrote at about 11:57:01 -0400 on Monday, October 27, 2008:
Jeffrey J. Kosowsky wrote:
What is the alternative if you don't have room on your server and if
you can't afford something fancier than a SAN?
For me, using NAS is very economical given the cost of drives
Adam Goryachev wrote at about 03:10:23 +1100 on Tuesday, October 28, 2008:
Jeffrey J. Kosowsky wrote:
What is the alternative if you don't have room on your server and if
you can't afford something fancier than a SAN?
For me, using NAS is very economical given the cost of drives
I got the following while using 'rsyncd' to backup a Windows XP
machine:
I originally had full backups labeled '0' and '1'.
I then started a new *full* backup.
For several hundred out of a total of 141,000 files, I got pairs of error
messages of the form:
Can't open
Nils Breunese (Lemonbit) wrote at about 01:07:02 +0100 on Tuesday, October 28,
2008:
Jeffrey J. Kosowsky wrote:
Now, the verbose output of BackupPC_dump shows:
Got remote protocol 30
Negotiated protocol version 28
while the extra extra verbose output
Jeffrey J. Kosowsky wrote at about 17:11:07 -0400 on Monday, October 27, 2008:
I got the following while using 'rsyncd' to backup a Windows XP
machine:
I originally had full backups labeled '0' and '1'.
I then started a new *full* backup.
For several hundred out of a total of 141,000
Nils Breunese (Lemonbit) wrote at about 02:04:40 +0100 on Tuesday, October 28,
2008:
Jeffrey J. Kosowsky wrote:
Nils Breunese (Lemonbit) wrote at about 01:07:02 +0100 on Tuesday,
October 28, 2008:
File::RsyncP is a (non-complete) implementation of rsync in Perl
written
I have:
$Conf{BackupFilesExclude} = [
'/Documents and Settings/*/LocalSettings/Temp/*',
'/Documents and Settings/*/Local Settings/Temporary Internet Files/*',
'/Documents and Settings/*/Local Settings/Application Data/Microsoft/Windows/UsrClass.dat',
'/Documents and Settings/*/Local
Craig Barratt wrote at about 22:59:19 -0700 on Monday, October 27, 2008:
Jeffrey writes:
I have:
$Conf{BackupFilesExclude} = [
'/Documents and Settings/*/LocalSettings/Temp/*',
[snip]
You have two sets of quotes here, so this is excluding this path:
Thanks - I knew it would
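A guess at the corrected form Craig is pointing at (the doubled quotes made the quote characters part of the pattern, so the exclude never matched anything): each path should be quoted exactly once in the Perl list:

```perl
$Conf{BackupFilesExclude} = [
    '/Documents and Settings/*/Local Settings/Temp/*',
    '/Documents and Settings/*/Local Settings/Temporary Internet Files/*',
];
```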
I have a spurious backup of one of my machines that shows up on the
web interface as a backup but when you click on it you get the error:
Error: Directory /var/lib/BackupPC//pc/mymachine/0 is empty
Looking in /var/lib/BackupPC/pc/mymachine, I see that indeed there is
no 0 directory.
Indeed,
I just got a slew of such errors:
BackupPC_link got error -4 when calling
MakeFileLink(/var/lib/BackupPC//pc/mypc/5/fc/fcygwin/fusr/attrib,
530bbf3350acfd3d1ce483619f9b47d0, 1)
I traced it back to the subroutine MakeFileLink, but the documentation
only details the positive return numbers and
Craig Barratt wrote at about 00:07:51 -0700 on Thursday, October 30, 2008:
Jeffrey writes:
So what does -4 mean and what can cause it?
Fails to make a hardlink. Several possible reasons: you are out
of inodes, your cpool and pc directory are on different file
systems, your
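The two causes listed can be checked mechanically; a sketch (the paths shown in the comments are the usual defaults, adjust for your install):

```shell
# Hardlinks only work within a single filesystem, so cpool/ and pc/ must
# report the same device number; a full inode table also breaks link().
same_filesystem() {
    [ "$(stat -c %d "$1")" = "$(stat -c %d "$2")" ]
}

# Against a real install one would run something like:
#   df -i /var/lib/BackupPC                    # IUse% of 100% = out of inodes
#   same_filesystem /var/lib/BackupPC/cpool /var/lib/BackupPC/pc ||
#       echo "cpool and pc are on different filesystems"
```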
I have found a number of files in my pool that have the same checksum
(other than a trailing _0 or _1) and also the SAME CONTENT. Each copy
has a few links to it by the way.
Why is this happening?
Isn't this against the whole theory of pooling? It also doesn't seem
to get cleaned up by
Holger Parplies wrote at about 10:11:33 +0100 on Thursday, October 30, 2008:
Hi,
Jeffrey J. Kosowsky wrote on 2008-10-30 03:41:39 -0400 [Re: [BackupPC-users]
What does BackupPC_link got error -4 when calling MakeFileLink mean?]:
Craig Barratt wrote at about 00:07:51 -0700 on Thursday
Tino Schwarze wrote at about 11:13:27 +0100 on Thursday, October 30, 2008:
Hi Jeffrey,
On Thu, Oct 30, 2008 at 03:55:16AM -0400, Jeffrey J. Kosowsky wrote:
I have found a number of files in my pool that have the same checksum
(other than a trailing _0 or _1) and also the SAME
Holger Parplies wrote at about 11:29:49 +0100 on Thursday, October 30, 2008:
Hi,
Jeffrey J. Kosowsky wrote on 2008-10-30 03:55:16 -0400 [[BackupPC-users]
Duplicate files in pool with same CHECKSUM and same CONTENTS]:
I have found a number of files in my pool that have the same
Tino Schwarze wrote at about 15:08:29 +0100 on Thursday, October 30, 2008:
On Thu, Oct 30, 2008 at 09:56:15AM -0400, Jeffrey J. Kosowsky wrote:
I'm not sure, though, how the file name is derived; I found another file
with the same name but a different MD5 sum:
.../cpool/0/0 # md5sum 8
Jeffrey J. Kosowsky wrote at about 10:04:26 -0400 on Thursday, October 30, 2008:
Holger Parplies wrote at about 11:29:49 +0100 on Thursday, October 30, 2008:
Hi,
Jeffrey J. Kosowsky wrote on 2008-10-30 03:55:16 -0400 [[BackupPC-users]
Duplicate files in pool with same CHECKSUM
Craig Barratt wrote at about 11:27:41 -0700 on Thursday, October 30, 2008:
Jeffrey writes:
Except that in my case some of the duplicated checksums truly are the
same file (probably due to the link issue I am having)...
Yes. Just as Holger mentions, if the hardlink attempt fails,
John Rouillard wrote at about 20:13:15 + on Thursday, October 30, 2008:
On Thu, Oct 30, 2008 at 10:04:26AM -0400, Jeffrey J. Kosowsky wrote:
Holger Parplies wrote at about 11:29:49 +0100 on Thursday, October 30,
2008:
Hi,
Jeffrey J. Kosowsky wrote on 2008-10-30 03:55
Jeffrey J. Kosowsky wrote at about 20:26:35 -0400 on Thursday, October 30, 2008:
John Rouillard wrote at about 20:13:15 + on Thursday, October 30, 2008:
On Thu, Oct 30, 2008 at 10:04:26AM -0400, Jeffrey J. Kosowsky wrote:
Holger Parplies wrote at about 11:29:49 +0100 on Thursday
I must be missing something on this whole compression, pooling, and
checksum matter.
I found 2 files in my cpool that have the same checksum (one is _0)
but 'cmp' shows different bytes. However, when I zcat them, they have
the same value. I thought that (lossless) compression was a 1-1
mapping?
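Lossless compression is indeed 1-1, but only from compressed bytes back to content, not the other way around: one content can have many valid encodings (different compression levels, different header fields). A small demonstration with plain gzip (BackupPC's cpool format differs, but the principle is the same):

```shell
cd "$(mktemp -d)"
printf 'the same content\n' > original
gzip -1 -c original > fast.gz    # fastest setting
gzip -9 -c original > best.gz    # best-compression setting
cmp -s fast.gz best.gz || echo "compressed files differ"
zcat fast.gz | cmp -s - original && zcat best.gz | cmp -s - original \
    && echo "both decompress to identical content"
```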
OK. I have been spending all day on this and am trying to understand
(and fix) the different types of pool duplication and corruption.
Types of Duplicate checksums:
1. Same checksum but contents differ -- INTENTIONAL - nothing to fix
2. Same checksum and compressed content
I have
Holger Parplies wrote at about 12:25:08 +0100 on Friday, October 31, 2008:
Hi,
Jeffrey J. Kosowsky wrote on 2008-10-30 23:16:05 -0400 [[BackupPC-users] 2
cpool files with same checksum, different (compressed content) but same
zcatt'ed content?]:
[...]
I found 2 files in my
Tino Schwarze wrote at about 12:20:50 +0100 on Friday, October 31, 2008:
On Thu, Oct 30, 2008 at 11:16:05PM -0400, Jeffrey J. Kosowsky wrote:
I must be missing something on this whole compression, pooling, and
checksum matter.
I found 2 files in my cpool that have the same
OK - since my nfs doesn't seem to be working, I now tried installing
BackupPC directly on my NAS device (dns-323).
When I run config.pl, I get the following error message:
Error loading BackupPC::Lib: Bareword compareLOGName not allowed
while strict subs in use at lib/BackupPC/Lib.pm
Les Mikesell wrote at about 10:27:20 -0500 on Friday, October 31, 2008:
Jeffrey J. Kosowsky wrote:
Is there a (reasonably easy) way of identifying which ones have the
rsync checksum seed and which ones don't???
I think you are kind of missing the point that they could both
Holger Parplies wrote at about 00:52:47 +0100 on Saturday, November 1, 2008:
Hi,
Jeffrey J. Kosowsky wrote on 2008-10-31 15:26:58 -0400 [Re: [BackupPC-users]
2 cpool files with same checksum, different (compressed content) but same
zcatt'ed content?]:
Les Mikesell wrote
Holger Parplies wrote at about 02:37:55 +0100 on Saturday, November 1, 2008:
Hi,
Jeffrey J. Kosowsky wrote on 2008-10-31 13:25:20 -0400 [[BackupPC-users]
Error installing BackupPC: - Bareword compareLOGName not allowed]:
When I run config.pl, I get the following error message
I have been considering the following:
- Uncompressing the full file to determine its length..
But this is very computationally inefficient for large files...
- Unpacking attrib file but this seems
This seems best, but I'm not sure what the best/easiest
subroutines are for
Jeffrey J. Kosowsky wrote at about 05:56:30 -0500 on Monday, November 3, 2008:
Craig Barratt wrote at about 18:06:43 -0700 on Friday, October 31, 2008:
Jeff writes:
Is there a (reasonably easy) way of identifying which ones have the
rsync checksum seed and which ones don't
Craig Barratt wrote at about 18:06:43 -0700 on Friday, October 31, 2008:
Jeff writes:
Is there a (reasonably easy) way of identifying which ones have the
rsync checksum seed and which ones don't???
I'm reluctant to even say, because you are heading in an unproductive
direction.
I have seen several people asking about how to delete files from their
backups.
I am contributing this *beta* perl script for input as a potential
solution.
It allows you to remove files and/or directories cleanly from one or
more backups (including adjusting the attrib file entry properly - I
A couple of caveats that I thought of after I hit the return...
1. Assuming that the logic of attrib entries (and type=10) hasn't
changed, this should be backward compatible with earlier versions of
BackupPC (I am using 3.1.0)
2. The program does *not* adjust the backupInfo files so if you
As I outlined in my earlier
message(http://sourceforge.net/mailarchive/message.php?msg_name=18693.62814.802874.426715%40consult.pretender),
it appears that some (if not a lot) of the difficulty with using
rsync/ssh with Windows is due to the version 28 limitation of
File::RsyncP.
- Are there any
Heinrich Christian Peters wrote at about 23:52:26 +0100 on Thursday, November
6, 2008:
I think the changes between protocols 28, 29, and 30 are documented here:
http://gd.tuwien.ac.at/utils/admin-tools/rsync/OLDNEWS
Very helpful but overwhelming ;)
Sounds like a lot of good improvements in
As a result of my saga with nfs problems causing broken links, I wrote
the following script for checking and fixing links
(NOTE: the problem turned out to be an interaction between nfs and
ext3 on the Linux 2.6.19 that my NAS runs. Seems to be a problem with
how directories are cached.)
Looking at the code and the structure of the storage and attrib files,
it doesn't seem like there is any way for BackupPC to record and
restore hard links.
Specifically, since BackupPC uses hard links to pool files and since
the attrib database doesn't seem to record hard links, it would seem
Cody Dunne wrote at about 09:00:49 -0500 on Friday, November 7, 2008:
Jeffrey J. Kosowsky wrote:
Heinrich Christian Peters wrote at about 23:52:26 +0100 on Thursday,
November 6, 2008:
I think the changes between protocols 28, 29, and 30 are documented
here:
http
Craig Barratt wrote at about 23:06:25 -0800 on Sunday, November 9, 2008:
Jeffrey writes:
Looking at the code and the structure of the storage and attrib files,
it doesn't seem like there is any way for BackupPC to record and
restore hard links.
Not true. Hardlinks are stored
I know that checksum caching was primarily introduced to speed up the
comparison with existing compressed pool files.
However, it seems like we get for free a built in file md4 checksum
that can be used to verify data integrity.
Is there any user-configurable way to turn on checksum caching the
Is the following true:
1. If a directory is *empty*, is there any reason for it to have an
attrib file?
Because in playing around with creating and deleting directory contents, I
found that sometimes even after emptying directory contents, the
subsequent incremental backups may
I have vastly improved and completely rewritten my program
BackupPC_deleteFiles.pl. Also many bugs were fixed ;)
The routine now allows you to delete arbitrary files and directories
(or list or globs thereof) across multiple hosts and shares, and
arbitrary (contiguous) backup ranges.
I understand why the cpool files are compressed using zlib with a
twist so that you can also save the checksums.
But why do the log files have to be in a format that can't be read
with standard unix tools? Especially since the system logs sit in
/var/log along with all the other files that get
Jeffrey J. Kosowsky wrote at about 22:51:22 -0500 on Monday, November 17, 2008:
Is the following true:
1. If a directory is *empty*, is there any reason for it to have an
attrib file?
Because in playing around with creating and deleting directory contents, I
found
dtktvu wrote at about 23:59:55 -0500 on Tuesday, November 18, 2008:
Based on the source from Kolosy
(http://www.kolosy.com/wordpress/?p=8 ), we have continued the
journey and successfully ported Rsync to C# (protocol 28). So far,
the program has the following features:
Sounds
Was hoping to get answers to these questions to confirm the algorithm I
am using on my BackupPC_deleteFile script. I just want to make sure I
am not missing any subtleties here with the attrib files.
Jeffrey J. Kosowsky wrote at about 22:51:22 -0500 on Monday, November 17, 2008:
Is the following
I often use the trick of putting (empty) -i files in my critical
directories to prevent the rm * disasters.
But, at least for me, BackupPC seems to be having problems when the
-i file is at the root of a share.
Specifically, I get the following error:
Can't open
[EMAIL PROTECTED] wrote at about 13:49:38 + on Friday, November 21, 2008:
I would like to perform actions prior to and after creating an archive.
Specifically mounting a Truecrypt encrypted removable disk. The following
command line works exactly as required when executed at the
For the part of my routines BackupPC_fixLinks and BackupPC_deleteFile
that actually try to make new links (or delete old ones), I would like
to make sure that nothing else is creating or deleting links to the
pool to avoid collisions.
Specifically, I would like to be able to do the following:
1.
Several people have recently asked about whether it is possible to
have some files backed up only occasionally or to keep some files for
only a limited amount of time.
One example: perhaps you want to always have the most recent couple
of backups of your temp files so that if something crashes
Shawn Austin wrote at about 13:26:02 -0500 on Sunday, November 23, 2008:
Hello all,
Recently, when I went to restore a directory, I ran into a problem where the
files were not listed in the backup interface.
I checked the attrib file for that directory with BackupPC_attribPrint and I
I am looking to back up ACLs for Windows so that restored files get
their ACLs back (I just had to restore 3 files and it was a pain to fix
their ACLs). Similarly, I run SELinux and it would be nice to have the
extended attributes recorded too.
Since rsync can sync both extended attributes and ACLs,
Nicholas Hall wrote at about 15:27:48 -0600 on Monday, November 24, 2008:
On Mon, Nov 24, 2008 at 2:48 PM, Les Mikesell [EMAIL PROTECTED] wrote:
The registry is another tricky part. You may need parts relating to
what you are restoring, but if you aren't going back to exactly the same
Les Mikesell wrote at about 14:48:03 -0600 on Monday, November 24, 2008:
Jeffrey J. Kosowsky wrote:
I am looking to back up ACLs for Windows so that restored files get
their ACLs back (I just had to restore 3 files and it was a pain to fix
their ACLs). Similarly, I run SELinux
Is there any reason that BackupPC doesn't allow files of a certain
type or extension to have compression skipped?
It seems like it would be beneficial to define lists, based on file
extensions and/or file types, that would allow you to skip
compression for already-compressed media types.
Nicholas Hall wrote at about 18:46:35 -0600 on Monday, November 24, 2008:
On Mon, Nov 24, 2008 at 4:41 PM, Jeffrey J. Kosowsky
[EMAIL PROTECTED]wrote:
Nicholas Hall wrote at about 15:27:48 -0600 on Monday, November 24, 2008:
On Mon, Nov 24, 2008 at 2:48 PM, Les Mikesell [EMAIL
Achim J. Latz wrote at about 17:42:17 +0100 on Tuesday, November 25, 2008:
For my company Qustodium Internet Security, I am in the process of
developing a VM appliance that will enable small (Windows-centric) networks
to enjoy the benefits of BackupPC.
Basically, the appliance runs
Does BackupPC know how to treat NTFS junction points?
They are analogous to *nix symbolic links but only work on
directories.
Based on a little test, it seems like BackupPC does not know about
them since it seems to have copied over all the data -- i.e. it
treated the junction as a real directory
Christian Völker wrote at about 22:32:19 +0100 on Saturday, November 29, 2008:
Yohoo!
are there any plans that BackupPC supports in the near future ACLs
and extended attributes?
Ehm- why should BackupPC not support these ACLs if
Cesar Voulgaris wrote at about 15:37:55 -0300 on Tuesday, December 2, 2008:
Hi all, I have this problem. I'm backing up several PCs in compressed form.
The backups are scheduled and done OK, and even the aged backups appear to be
removed in the PC-specific log files, like this:
.
The default is:
$Conf{WakeupSchedule} = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13,
14, 15, 16, 17, 18, 19, 20, 21, 22, 23];
Is there any reason midnight is left off?
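If midnight is wanted, 0 is an accepted hour; one caveat (per the BackupPC docs) is that the first entry in the list is when BackupPC_nightly runs, so the ordering matters. A sketch:

```perl
$Conf{WakeupSchedule} = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11,
                         12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23];
```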
Just as an FYI, it is possible to use perl code within config files so
that you can use a single config file yet still customize
configurations by pc (or groups of pc's) without having to duplicate
changes across multiple relatively similar config files each time you
change a parameter.
For
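A minimal sketch of the idea (the hostnames, settings, and the common.pl file are invented examples): since BackupPC config files are evaluated as ordinary Perl, a per-PC file can be reduced to a call into one shared file instead of a copy of every setting:

```perl
# common.pl -- shared logic, pulled in from each per-PC config file
sub windows_defaults {
    my ($conf) = @_;
    $conf->{XferMethod}         = 'rsyncd';
    $conf->{BackupFilesExclude} = [ '/pagefile.sys', '/hiberfil.sys' ];
}
1;

# pc/winpc1.pl -- the entire per-PC config might then be just:
#   do '/etc/BackupPC/common.pl';
#   windows_defaults(\%Conf);
```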
Craig Barratt wrote at about 15:21:58 -0800 on Tuesday, December 2, 2008:
Jeffrey writes:
Just as an FYI, it is possible to use perl code within config files so
that you can use a single config file yet still customize
configurations by pc (or groups of pc's) without having to
Mark Adams wrote at about 16:39:35 -0700 on Wednesday, December 3, 2008:
Hi there, me again.
I have several machines I would like to back up with BackupPC, but the
server is a humble box with several installed hard discs -- nothing fancy.
I need to backup each client machine to a
Jeffrey J. Kosowsky wrote at about 12:03:16 -0500 on Wednesday, December 3,
2008:
Craig Barratt wrote at about 15:21:58 -0800 on Tuesday, December 2, 2008:
Jeffrey writes:
Just as an FYI, it is possible to use perl code within config files so
that you can use a single config
Nils Breunese (Lemonbit) wrote at about 23:23:26 +0100 on Friday, December 5,
2008:
Jeffrey J. Kosowsky wrote:
This may be a naive question, but I was wondering what is the state of
BackupPC development? (I couldn't find answers on the sourceforge
site)
This is the users
Johan Ehnberg wrote at about 17:49:05 +0200 on Sunday, December 7, 2008:
dan wrote:
Specifically, it seems to me that we should distinguish (at least)
among the following situations for long dump/restore times
1. Large backups/slow links - here...
2.
One of my Windows backups has now failed consecutively about 20 times in
the last 2 days with the message:
Got fatal error during xfer (Child exited prematurely)
I am not getting any email messages about these failures and I only
even realized it because when running 'top' I noticed that this
Some of my WinXP backups occasionally fail with the generic LOG error
message:
12:00:30 Started incr backup on mymachine (pid=4983, share=c)
13:03:54 Backup failed on mymachine (Child exited prematurely)
and the more helpful XferLOG.bad.z message:
...
Stuart Luscombe wrote at about 10:02:04 + on Monday, December 8, 2008:
Hi there,
I've been struggling with this for a little while now so I thought it about
time I got some help!
We currently have a server running BackupPC v3.1.0 which has a pool of
around 3TB and
Holger Parplies wrote at about 00:24:56 +0100 on Tuesday, December 9, 2008:
Hi,
Mark Adams wrote on 2008-12-08 14:37:54 -0700 [Re: [BackupPC-users]
BackupFilesExcludes for Linux]:
I think I spotted a mistake in my own speculative config. See below.
right, and there's a second
Holger Parplies wrote at about 04:10:17 +0100 on Tuesday, December 9, 2008:
Hi,
Jeffrey J. Kosowsky wrote on 2008-12-08 09:37:16 -0500 [Re: [BackupPC-users]
Advice on creating duplicate backup server]:
It just hit me that given the known architecture of the pool and cpool
Jeffrey J. Kosowsky wrote at about 15:50:45 -0500 on Sunday, December 7, 2008:
Some of my WinXP backups occasionally fail with the generic LOG error
message:
12:00:30 Started incr backup on mymachine (pid=4983, share=c)
13:03:54 Backup failed on mymachine (Child
Nick Smith wrote at about 11:09:37 -0500 on Tuesday, December 9, 2008:
On Tue, Dec 9, 2008 at 11:01 AM, Jeffrey J. Kosowsky
[EMAIL PROTECTED] wrote:
Jeffrey J. Kosowsky wrote at about 15:50:45 -0500 on Sunday, December 7,
2008:
Some of my WinXP backups occasionally fail
I did the first backup on a new machine but aborted it part way
through because I had the excludes wrong. It was recorded as a partial
level 0 backup.
I then ran BackupPC_dump from the command line using '-i'
(incremental) which completed without errors
The partial backup was (appropriately)
Jeffrey J. Kosowsky wrote at about 13:26:47 -0500 on Friday, December 12, 2008:
I did the first backup on a new machine but aborted it part way
through because I had the excludes wrong. It was recorded as a partial
level 0 backup.
I then ran BackupPC_dump from the command line using '-i
Updated version attached below:
Jeffrey J. Kosowsky wrote at about 15:52:56 -0500 on Friday, December 12, 2008:
The enclosed script (and a 1-line cmd.exe helper) cleanly and
automatically sets up shadow copies, mounts them, and launches the
rsync daemon without requiring any special
I had been thinking of writing code to implement a robust fuse
filesystem for BackupPC backups but then I saw that John Craig (and
perhaps others) had started to write code.
While the code still seems to be at the proof-of-concept stage, I think
the idea is very powerful and extensible.
The obvious