Re: [BackupPC-users] backuppc and opensuse 10.2 clients

2007-01-10 Thread Craig Barratt
Cristian writes:

> Craig, maybe a good idea to add --specials to the default list of options?
> (or, even better, ignore sockets altogether somehow).

It's fixed in BackupPC 3.0.0beta and is also fixed if you upgrade
to File::RsyncP 0.68.

Craig

-
Take Surveys. Earn Cash. Influence the Future of IT
Join SourceForge.net's Techsay panel and you'll get the chance to share your
opinions on IT & business topics through brief surveys - and earn cash
http://www.techsay.com/default.php?page=join.php&p=sourceforge&CID=DEVDEV
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


[BackupPC-users] BlackoutPeriod not respected?

2007-01-10 Thread Ward... James Ward
Hi,


I'm running version 2.1.1 and have the following config file for one
machine, but the BlackoutPeriods setting doesn't seem to be respected.
Is there anything wrong with this?


Contents of file /var/lib/backuppc/pc/charles/config.pl, modified
2007-01-04 12:29:12

$Conf{BackupFilesExclude} = ['/Volumes', '/Network', '/automount'];
$Conf{RsyncClientCmd} = '$sshPath -q -x -l backups $host nice -n 19 sudo /Users/backups/rsyncSend $argList+';
$Conf{BlackoutPeriods} = [
    {
        hourBegin =>  7.0,
        hourEnd   => 19.5,
        weekDays  => [1, 2, 3, 4, 5],
    },
];

Thanks in advance,

James
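A possible culprit, offered as a hedged editorial note rather than anything stated in the thread: BackupPC only honors a blackout once the host has answered $Conf{BlackoutGoodCnt} consecutive pings, so a client with flaky ping responses may never enter its blackout period. The related settings (values shown are the usual defaults):

```perl
# Blackout only engages after this many consecutive good pings.
$Conf{BlackoutGoodCnt} = 7;
# Pings slower than this many milliseconds count as failures.
$Conf{PingMaxMsec} = 20;
```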




[BackupPC-users] Transferred data lost

2007-01-10 Thread Yves Trudeau
Hi,
we are experimenting with BackupPC and we are backing up a 30 GB share
over the Internet with rsyncd.  This morning, after more than 30 hours of
transfer, the remote host was accidentally rebooted, so the connection
was lost for a few minutes.  BackupPC restarted the backup very nicely,
but the content of the new folder seems to be lost and all the files
are transferred again.  Is this the normal behavior of BackupPC?  We use
3.0.0beta3.

Yves

-- 
Yves Trudeau, Ph. D., MCSE, OCP
Senior Analyst
Révolution Linux
819-780-8955 ext. *104

Any views and opinions expressed in this email are solely those of the author
and do not necessarily represent those of Révolution Linux







Re: [BackupPC-users] Transferred data lost

2007-01-10 Thread Jason Hughes
Unfortunately, yes.

What you might want to do is put some of the larger directories in the 
BackupFilesExclude list for that client.  Then, do a full backup.  
After that backup succeeds, remove one of the excluded folders and 
trigger another backup.  Rinse, repeat.

This way you will populate the set of files that BackupPC knows about, 
and subsequent backups will skip them quickly and move on to the 'new' 
files, transferring only those.  Once the whole machine has had all its 
files backed up once, even a slow connection is pretty dependable for 
backups.
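The staged-exclude approach above could be sketched like this in the client's per-PC config.pl (the share name and directory paths are hypothetical placeholders):

```perl
# Round 1: exclude the big trees, then run a full backup.
# After each successful full, delete one entry from the list and run
# another full, until the exclude list is empty.
$Conf{BackupFilesExclude} = {
    '/' => ['/var/bigdata', '/home/archive'],   # hypothetical paths
};
```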

In my experience, having a single very long backup over a slow 
connection is doomed to fail repeatedly for various reasons (loss of 
connection, reboot, network hiccup, etc).

Hope that helps,
JH

Yves Trudeau wrote:
> Hi,
> we are experimenting with BackupPC and we are backing up a 30 GB share
> over the Internet with rsyncd.  This morning, after more than 30 hours of
> transfer, the remote host was accidentally rebooted, so the connection
> was lost for a few minutes.  BackupPC restarted the backup very nicely,
> but the content of the new folder seems to be lost and all the files
> are transferred again.  Is this the normal behavior of BackupPC?  We use
> 3.0.0beta3.
>
> Yves



Re: [BackupPC-users] Backup a host 1 time and 1 time only.

2007-01-10 Thread Jason Hughes

From the documentation:


   Other installation topics

*Removing a client*

   If there is a machine that no longer needs to be backed up (eg: a
   retired machine) you have two choices. First, you can keep the
   backups accessible and browsable, but disable all new backups.
   Alternatively, you can completely remove the client and all its backups.

   To disable backups for a client there are two special values for
   $Conf{FullPeriod} in that client's per-PC config.pl file:

*-1*

   Don't do any regular backups on this machine. Manually requested
   backups (via the CGI interface) will still occur.

*-2*

   Don't do any backups on this machine. Manually requested backups
   (via the CGI interface) will be ignored.



Ryan Turnbull wrote:
Is there a way to have BackupPC back up a system once and keep the backup 
indefinitely?  I have backed up a crashed PC and restored some data from 
the backup, but I DON'T want BackupPC to back up that host 
again.  The PC host is at the same IP address as before.  I also need 
to be able to view the backup within the CGI.


Please let me know if there is a way.

Thanks

Ryan



Re: [BackupPC-users] Backup a host 1 time and 1 time only.

2007-01-10 Thread Carl Wilhelm Soderstrom
On 01/10 02:50 , Ryan Turnbull wrote:
> Is there a way to have BackupPC back up a system once and keep the backup
> indefinitely?

Put this in your per-PC config file (uncommented, obviously).

# use this if you want to stop backing the box up.
#$Conf{FullPeriod} = -1;
#$Conf{EMailNotifyMinDays} = 365;


-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



[BackupPC-users] webapp hinted incrementals w/ tar -T or rsync --files-from=

2007-01-10 Thread brien dieterle
We have a CMS that basically stores user data in a filesystem structure such as 
/users/a/b/abraham/.  Whenever a user edits one of their own files, the 
webapp will touch a file in a specific location such as 
/activeUsers/abraham.  We use a predump script that quickly generates a 
list of recently active users' paths, such as:
./users/a/b/abraham
./users/b/r/brien

With tar, the full backup is straightforward (and time-consuming).  
The incrementals, however, are fed the -T option for the file list 
(very fast).  We quickly realized that our incrementals are not 
filled, presumably because tar is not creating the empty directory 
tree for the rest of the filesystem (BackupPC thinks those directories 
were deleted).  We've really thrown a monkey wrench into the backup 
semantics, since not only are the incrementals based off the last full 
(--newer), but the list (-T) will only include paths that had updates 
since the last incremental (which could be hours).

SO, now we are installing BackupPC 3.0beta for the multilevel 
incrementals feature (very cool!!).  We are hoping that we can use rsync 
instead of tar so that the incrementals will be 'filled' and such 
(not to mention deleted/renamed files are detected).  One problem is 
there is no RsyncFullArgs or RsyncIncArgs with which we can trick 
BackupPC the same way we trick tar.  Are we being nuts here?  Is this 
even possible with rsync?

Our end goal is to be able to take a monthly full, and then 
incrementals every 2 hours (360 incrementals).  By using the hints it is 
very fast (~10 mins per backup).  Our motivation is the fact that we 
have 8 million+ files, and regular tar/rsync fulls or incrementals 
literally take 15+ hours (and growing) to complete, simply due to the 
traversal of all these trees and checking stats on these small files 
(it generates unacceptable iowait as well).

Here is our tar config that works, aside from the lack of filled incrementals:

$Conf{TarClientCmd} = '$tarPath -c -v -f - -C /mnt/nfs'
    . ' --totals';

# change dir to where the nfs data is mounted
$Conf{TarClientRestoreCmd} = '$tarPath -x -v -f - -C /mnt/nfs'
    . ' --totals';

$Conf{TarFullArgs} = './users';   # back up the entire users folder

# back up only files newer than the last full, but ALSO only those listed
# in backuplist.txt (webapp generated)
$Conf{TarIncrArgs} = '--newer=$incrDate -T /var/lib/backuppc-meta/backuplist.txt';

$Conf{TarShareName} = '/';   # this is arbitrary

$Conf{DumpPreUserCmd}  = '/usr/local/bin/predump.sh';           # generates backuplist.txt
$Conf{DumpPostUserCmd} = '/usr/local/bin/postdump.sh $xferOK';  # removes backuplist.txt if the xfer was OK


Thanks for any advice!!

brien
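The predump.sh script itself isn't shown in the post; here is a minimal sketch of what such a script might do, written as a shell function (the marker-directory layout and the ./users/a/b/abraham mapping are inferred from the examples above — the real script may differ):

```shell
# generate_backuplist MARKER_DIR LISTFILE
# Maps each marker file (touched by the webapp, e.g. /activeUsers/abraham)
# to a tar -T path like ./users/a/b/abraham, one per line, into LISTFILE.
generate_backuplist() {
    marker_dir=$1
    list=$2
    : > "$list"                              # truncate/create the list file
    for f in "$marker_dir"/*; do
        [ -e "$f" ] || continue              # skip if the glob matched nothing
        u=$(basename "$f")                   # user name, e.g. abraham
        a=$(printf '%s' "$u" | cut -c1)      # first-letter bucket
        b=$(printf '%s' "$u" | cut -c2)      # second-letter bucket
        printf './users/%s/%s/%s\n' "$a" "$b" "$u" >> "$list"
    done
}
```

A postdump.sh counterpart would then simply remove the list file when $xferOK indicates success.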



[BackupPC-users] recovering status.pl

2007-01-10 Thread David Koski
I made the mistake of deleting some archives manually, and now my status.pl
file is zero size.  BackupPC will not start unless I delete status.pl,
but then none of my archives are visible in BackupPC.  How can I recover from
this?

Thank you,
David Koski
[EMAIL PROTECTED]





Re: [BackupPC-users] Backup a host 1 time and 1 time only.

2007-01-10 Thread Bradley Alexander
Can I put a slightly different spin on this question? Is there a way to earmark 
a backup not to be overwritten? In my situation, I just brought up the backuppc 
server, and now I have a good backup of everything on each box. In the future, 
I would like to only back up a subset of the directories (basically everything 
needed to reconstitute a Debian or Ubuntu box, /home /root /etc /usr/local 
/var/log /var/backups /var/cache/apt /var/cache/debconf /var/lib /boot 
/lib/modules) on a regular basis. Maybe monthly, quarterly or even 
semi-annually, I would get everything else (most of which would be binaries 
included in packages)...

Is it possible to designate either more than two levels of backups (instead of 
full and incrementals, maybe weekly/monthly/daily or something of that sort) or 
be able to set a do-not-delete flag on a specific backup (possibly with a 
comment)?

Thanks,
--b


- Original Message -
From: Carl Wilhelm Soderstrom [EMAIL PROTECTED]
To: BackupPC-users@lists.sourceforge.net
Sent: Wednesday, January 10, 2007 5:01:37 PM GMT-0500 US/Eastern
Subject: Re: [BackupPC-users] Backup a host 1 time and 1 time only.

On 01/10 02:50 , Ryan Turnbull wrote:
> Is there a way to have BackupPC back up a system once and keep the backup
> indefinitely?

Put this in your per-PC config file (uncommented, obviously).

# use this if you want to stop backing the box up.
#$Conf{FullPeriod} = -1;
#$Conf{EMailNotifyMinDays} = 365;


-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Transferred data lost

2007-01-10 Thread Craig Barratt
Yves writes:

> we are experimenting with BackupPC and we are backing up a 30 GB share
> over the Internet with rsyncd.  This morning, after more than 30 hours of
> transfer, the remote host was accidentally rebooted, so the connection
> was lost for a few minutes.  BackupPC restarted the backup very nicely,
> but the content of the new folder seems to be lost and all the files
> are transferred again.  Is this the normal behavior of BackupPC?  We use
> 3.0.0beta3.

A partial backup is saved on failure.  With rsync and rsyncd the next
full backup essentially does an incremental (just checking the
attributes) on the files already in the partial, and copies the
other files that aren't in the partial.

So while the new directory starts off empty, all the files in
the partial that haven't changed are hardlinked without being
transferred.

Craig



Re: [BackupPC-users] [Fwd: [Fwd: your post re: Rsync backuppc]]

2007-01-10 Thread Craig Barratt
Tom writes:

> Is this it? To get File::RsyncP, I just did an apt-get on this:
> ii  libfile-rsyncp 0.52-1 A perl based implementation of an
> Rsync clie

That's pretty old (> 2.5 years).  Please upgrade to 0.68.

Craig



Re: [BackupPC-users] recovering status.pl

2007-01-10 Thread Craig Barratt
David writes:

> I made the mistake of deleting some archives manually, and now my status.pl
> file is zero size.  BackupPC will not start unless I delete status.pl,
> but then none of my archives are visible in BackupPC.  How can I recover from
> this?

It's ok to delete status.pl.  Over time the status will get refreshed.

It sounds like you deleted the archives file from the pc/HOST
directories.  There isn't a way to restore those.  But they
aren't used by BackupPC - they are only there for logging
purposes.

Craig



Re: [BackupPC-users] Backup a host 1 time and 1 time only.

2007-01-10 Thread Craig Barratt
Bradley writes:

> Can I put a slightly different spin on this question? Is there a
> way to earmark a backup not to be overwritten? In my situation, I
> just brought up the backuppc server, and now I have a good backup
> of everything on each box. In the future, I would like to only back
> up a subset of the directories (basically everything needed to
> reconstitute a Debian or Ubuntu box, /home /root /etc /usr/local
> /var/log /var/backups /var/cache/apt /var/cache/debconf /var/lib
> /boot /lib/modules) on a regular basis. Maybe monthly, quarterly or
> even semi-annually, I would get everything else (most of which
> would be binaries included in packages)...
>
> Is it possible to designate either more than two levels of backups
> (instead of full and incrementals, maybe weekly/monthly/daily or
> something of that sort) or be able to set a do-not-delete flag on
> a specific backup (possibly with a comment)?

This isn't supported directly.

One approach is to create an additional virtual host in BackupPC,
so there are actually two BackupPC hosts per actual client.  One
(eg: HOST_base) would keep the single backup.  The second (eg: HOST)
would do the regular backups of a subset.  You use $Conf{ClientNameAlias}
to point both hosts to the same client.

Craig
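A sketch of the two-host layout described above, as per-PC config.pl fragments (the host names, the client address, and the subset list are hypothetical; $Conf{BackupFilesOnly} is used here as one way to limit the regular host to a subset):

```perl
# pc/host_base/config.pl -- holds the one-time full; no scheduled backups
$Conf{ClientNameAlias} = 'realclient.example.com';   # hypothetical address
$Conf{FullPeriod}      = -1;    # manual backups only

# pc/host/config.pl -- regular backups of just the needed subset
$Conf{ClientNameAlias} = 'realclient.example.com';
$Conf{BackupFilesOnly} = ['/home', '/etc', '/usr/local', '/var/log'];
```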
