I modified the rsync code to limit restore to a share:
against (# Version 3.0.0, released 28 Jan 2007)
(root) diff /usr/local/BackupPC/lib/BackupPC/Xfer/Rsync.pm
/usr/local/BackupPC/lib/BackupPC/Xfer/Rsync.pm.orig
134,149d133
## AQUEOS debut
if( defined
A little change in the way you define it:
Use
$Conf{rsyncRestoreLimitToShare} = ('/var','/home');
and not
$Conf{rsyncRestoreLimitToShare} = ['/var','/home'];
I do not know the difference, but it does not work with [].
--
Best regards,
Ghislain
Robin Lee Powell wrote:
I've been using my own scripts
http://digitalkingdom.org/~rlpowell/hobbies/backups.html to remotely
mirror backuppc's data in an encrypted fashion.
The problem is, the time rsync takes seems to keep growing. I
expect this to continue more-or-less without bound, and
Hello and thanks for a nice program. I like the design of BackupPC,
therefore I decided to try it out.
Backups aren't any good unless it's possible to extract data from them some
late night when everything goes bad. I have no problem restoring from the
user interface. But how do I do it if I just
Mark Best wrote:
http://www.pcc-services.com/sles/backuppc.html
Step-by-Step guide to BackupPC on SLES.
I have my own notes for getting BackupPC to work on SUSE 10.3 if you
need them.
Hi Mark, thanks for your reply.
I've already been following your SLES guide, and after a lot of
buggering
Johan J wrote:
Hello and thanks for a nice program. I like the design of BackupPC,
therefore I decided to try it out.
Backups aren't any good unless it's possible to extract data from them
some late night when everything goes bad. I have no problem
restoring from the user interface.
Hello,
I'm a newbie too, so I won't go in depth, and take my answer with a grain of
salt:
About tar:
To extract all files from a compressed tar archive named archive.tar.gz:
tar -xzf archive.tar.gz
To extract to a specific folder, use:
tar -xzf
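For completeness, GNU tar's -C flag changes to the given directory before extracting; a self-contained sketch (file and directory names are made up for illustration):

```shell
# Build a small archive, then extract it into a specific folder with -C.
mkdir -p src target
echo hello > src/file.txt
tar -czf archive.tar.gz -C src file.txt
tar -xzf archive.tar.gz -C target    # extracts into ./target
cat target/file.txt                  # prints: hello
```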
On 6 Feb 2008, at 11:33, Gilles Guiot wrote:
I'm a newbie too, so I won't go in depth, and take my answer with
a grain of salt:
About tar:
To extract files from a tar file: extracts all files from a compressed
tar archive named archive.tar.gz.
tar -xzf
Hello,
I'm using the latest backuppc version, on a debianserver.
I selected a few shares to back up on a Linux server. Everything goes fine. When
I browse the backup, it appears all shares except those I selected were backed up.
The shares I selected (BackupFilesOnly) were not backed up, whether I
Hi!
Just a small problem while using BackupPC:
More mails are sent nightly to the users than the EMailNotifyMinDays
variable should allow. In fact, one mail is sent every night.
I realised that those mails are not mentioned in the Email Summary (in the
CGI).
To me, it looks like:
- the
We've run into a problem when running backuppc on Windows XP hosts where
many files simply aren't being backed up. For instance if I've got a
directory with MP3s in it the entire directory tree will be created and
viewable in the backuppc web gui, but the bottom level directory will be
empty.
On Wed, Feb 06, 2008 at 10:20:46AM +0100, Nils Breunese (Lemonbit)
wrote:
It is generally believed on this list (I believe) that it's not
feasible to use something as 'high-level' as rsync to replicate
BackupPC's pool. The amount of memory needed by rsync will just
explode because of all the
Robin Lee Powell wrote:
On Wed, Feb 06, 2008 at 10:20:46AM +0100, Nils Breunese (Lemonbit)
wrote:
It is generally believed on this list (I believe) that it's not
feasible to use something as 'high-level' as rsync to replicate
BackupPC's pool. The amount of memory needed by rsync will just
On Wed, Feb 06, 2008 at 11:22:10AM -0600, Les Mikesell wrote:
Robin Lee Powell wrote:
On Wed, Feb 06, 2008 at 10:20:46AM +0100, Nils Breunese
(Lemonbit) wrote:
It is generally believed on this list (I believe) that it's not
feasible to use something as 'high-level' as rsync to replicate
On Wed, Feb 06, 2008 at 09:33:47AM -0800, Robin Lee Powell wrote:
On Wed, Feb 06, 2008 at 11:22:10AM -0600, Les Mikesell wrote:
Robin Lee Powell wrote:
On Wed, Feb 06, 2008 at 10:20:46AM +0100, Nils Breunese
(Lemonbit) wrote:
It is generally believed on this list (I believe) that it's
On 02/06 09:39 , Robin Lee Powell wrote:
My backuppc pool and pc directories together have 2442024 files, and
10325584 KiB of data.
If I'm reading that correctly, that's only about 10GB of data.
I once tried syncing 100GB of backuppc pool data from one disk to another,
on the same machine,
Robin Lee Powell wrote:
It is generally believed on this list (I believe) that it's not
feasible to use something as 'high-level' as rsync to replicate
BackupPC's pool. The amount of memory needed by rsync will just
explode because of all the hardlinks. Usually people have been
using
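The hardlink issue can be seen on a small scale: rsync only preserves hardlinks when given -H, and doing so forces it to remember every linked inode, which is what blows up on a pool with millions of links. A toy demonstration (assuming rsync is installed; the directory names are made up):

```shell
# Two hardlinked files copied with -H still share one inode at the target.
mkdir -p pool
echo data > pool/a
ln pool/a pool/b                 # hardlink, as BackupPC's pool does
rsync -aH pool/ mirror/          # -H preserves the link structure
stat -c %i mirror/a mirror/b     # prints the same inode number twice
```

Without -H, mirror/a and mirror/b would be independent copies and the pool's deduplication would be lost.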
On Wed, Feb 06, 2008 at 12:03:03PM -0600, Carl Wilhelm Soderstrom
wrote:
On 02/06 09:39 , Robin Lee Powell wrote:
My backuppc pool and pc directories together have 2442024
files, and 10325584 KiB of data.
If I'm reading that correctly, that's only about 10GB of data.
Yes, but lots of
On Wed, Feb 06, 2008 at 09:33:47AM -0800, Robin Lee Powell wrote:
This reminds me: is there some fundamental reason backuppc can't
use symlinks? It would make so many things like this *so* much
easier. It's such a great package otherwise; this is the only thing
that's given me cause to be
Robin Lee Powell [EMAIL PROTECTED] wrote:
On Wed, Feb 06, 2008 at 09:33:47AM -0800, Robin Lee Powell wrote:
This reminds me: is there some fundamental reason backuppc can't
use symlinks? It would make so many things like this *so* much
easier. It's such a great package otherwise; this is
On Wed, Feb 06, 2008 at 02:20:09PM -0500, Paul Fox wrote:
Robin Lee Powell [EMAIL PROTECTED] wrote:
On Wed, Feb 06, 2008 at 09:33:47AM -0800, Robin Lee Powell
wrote:
This reminds me: is there some fundamental reason backuppc
can't use symlinks? It would make so many things like
Robin Lee Powell wrote:
This reminds me: is there some fundamental reason backuppc
can't use symlinks? It would make so many things like this
*so* much easier. It's such a great package otherwise; this is
the only thing that's given me cause to be annoyed with it.
Still
On Wed, Feb 06, 2008 at 02:03:04PM -0600, Les Mikesell wrote:
Robin Lee Powell wrote:
This reminds me: is there some fundamental reason backuppc
can't use symlinks? It would make so many things like
this *so* much easier. It's such a great package otherwise;
this is the only
On 02/06 11:14 , Robin Lee Powell wrote:
On Wed, Feb 06, 2008 at 12:03:03PM -0600, Carl Wilhelm Soderstrom
wrote:
On 02/06 09:39 , Robin Lee Powell wrote:
My backuppc pool and pc directories together have 2442024
files, and 10325584 KiB of data.
If I'm reading that correctly,
When I get a complete answer like that, I can't wait to try it!
I see your point about the software RAID. I think I will switch to it.
On my PERC 2 I think I have 48M of cache on it.
The only thing I need to read about is LVM. Does anyone here have a
good link about this
On Wed, Feb 06, 2008 at 02:11:16PM -0600, Carl Wilhelm Soderstrom
wrote:
On 02/06 11:14 , Robin Lee Powell wrote:
On Wed, Feb 06, 2008 at 12:03:03PM -0600, Carl Wilhelm
Soderstrom wrote:
On 02/06 09:39 , Robin Lee Powell wrote:
My backuppc pool and pc directories together have 2442024
On 02/06 12:20 , Robin Lee Powell wrote:
Yes, but are you trying to maintain a remote sync over a DSL line?
:D
no; because I have all those files. :)
If I have an offsite backup server, I just have it do backups completely in
parallel with the onsite backup server. This gives much more
In the past, I've done my remote mirrors of my backuppc backups one
of two ways:
1. Run tarCreate or whatever, and create giant tarballs of the
things I've backed up. In the past, this has been totally
inappropriate for remote mirroring, because encrypting the file
would kill rsync's ability
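The effect is easy to reproduce: encrypting the same (or nearly the same) data twice yields completely different output, leaving rsync's delta algorithm nothing to match. A small illustration (assuming the openssl CLI is available; file names and passphrase are made up):

```shell
# Encrypt two nearly-identical plaintexts; the ciphertexts share nothing,
# so an rsync transfer of the encrypted file degenerates to a full copy.
echo "backup data v1" > plain.txt
openssl enc -aes-256-cbc -pbkdf2 -pass pass:secret -in plain.txt -out c1.bin
echo "backup data v2" > plain.txt
openssl enc -aes-256-cbc -pbkdf2 -pass pass:secret -in plain.txt -out c2.bin
cmp -s c1.bin c2.bin && echo identical || echo different   # prints: different
```

(With openssl enc the random salt alone guarantees different output even for identical input, but the point stands: byte-level similarity does not survive encryption.)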
On Wed, Feb 06, 2008 at 02:35:21PM -0600, Carl Wilhelm Soderstrom
wrote:
On 02/06 12:20 , Robin Lee Powell wrote:
Yes, but are you trying to maintain a remote sync over a DSL
line? :D
no; because I have all those files. :)
If I have an offsite backup server, I just have it do backups
ADNET Ghislain wrote:
I modified the rsync code to limit restore to a share:
against (# Version 3.0.0, released 28 Jan 2007)
(root) diff /usr/local/BackupPC/lib/BackupPC/Xfer/Rsync.pm
/usr/local/BackupPC/lib/BackupPC/Xfer/Rsync.pm.orig
134,149d133
## AQUEOS debut
if(
Robin Lee Powell wrote:
Hardlinking is an atomic operation, tied to the inode of the
filesystem, so once established the target identity can't be
confused. Symlinks are just re-evaluated as filenames when you
open them.
Yes, I understand that.
So, if you created a symlink to a pooled
I attempted to include extended attributes, and found that it results in
the error message "fileListReceive failed". I also found an old message
in this list, about having to use --devices instead of -D. I think the
problem is that BackupPC's Perl Rsync client is an incomplete
implementation. It
On Wed, Feb 06, 2008 at 03:19:32PM -0600, Les Mikesell wrote:
Robin Lee Powell wrote:
Again: not being able to reasonably mirror the backup system is a
Real Problem; do you have any other ideas as to how to fix it?
I do it locally with raid1 mirroring and physically rotate the
drives
On Wed, Feb 06, 2008 at 03:19:32PM -0600, Les Mikesell wrote:
Or, if you want a local copy too and don't want to burden the
target with 2 runs, just do a straight uncompressed rsync copy
locally, then let your remote backuppc run against that to save
your compressed history on an encrypted
Robin Lee Powell wrote:
On Wed, Feb 06, 2008 at 03:19:32PM -0600, Les Mikesell wrote:
Or, if you want a local copy too and don't want to burden the
target with 2 runs, just do a straight uncompressed rsync copy
locally, then let your remote backuppc run against that to save
your compressed
I just ran a yum update on my Fedora 7 machine, and among all the other
updates was one for BackupPC. My BackupPC package is now
BackupPC-3.0.0-3.fc7 (I might have had BackupPC-2.something beforehand).
The problem is that the machine that was my Archive machine now is being
treated as a normal
Michael Mansour wrote:
Hi,
Ever since I updated to the latest Linux kernel (I use Scientific Linux 4.5), I
get these same backup errors everyday on each of the SL4.5 servers:
Xfer PIDs are now 26468,26747
[ skipped 20459 lines ]
Thanks Gilles and Nils for the fast replies!
Nils, your answer was the one I was looking for, even if I was hoping for a
solution (like Gilles' answer) with a standard file format. I guess the
developers have their reasons for doing it like this. But at least the files
are stored one by one :-) so one