Re: [BackupPC-users] ssh+rsync and known_hosts

2023-08-27 Thread Norbert Hoeller via BackupPC-users




>On 7/22/23 11:59 AM, Kenneth Porter wrote:
>> I'm setting up some Raspberry Pis and I set up BackupPC to back them up 
>> using ssh+rsync. I installed the key in ~backuppc/.ssh/authorized_keys but 
>> the initial backup was still failing. So I tried manually ssh'ing into the 
>> Pi and discovered I was hitting the question to add the Pi to known_hosts. I 
>> don't see this mentioned in the documentation. I'm not sure where it would 
>> even go, but I wanted to mention it as I'll likely forget this a year from 
>> now.

I have learned from past experience to log in as the backuppc user and SSH to 
the remote client using the client host name from the client configuration 
file.  This confirms that everything is set up properly and also adds the 
client to known_hosts.
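
For anyone setting this up, a minimal sketch of that check (the client name 
'pi1' and the root login are placeholders; use whatever host name and ssh 
login your configuration actually specifies):

    # run ssh once as the backuppc user, exactly as BackupPC will
    sudo -u backuppc ssh -l root pi1 whoami   # answer 'yes' to store the host key

    # or record the host key non-interactively
    ssh-keyscan -H pi1 | sudo tee -a ~backuppc/.ssh/known_hosts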




Re: [BackupPC-users] Deleting backups

2022-08-30 Thread Norbert Hoeller via BackupPC-users
After excluding /home/backuppc from the localhome backup, I did a successful 
incremental backup of /home on the 24th.  However, I noticed that the file 
server was very slow on the morning of the 25th and a number of overnight 
backups were still running.  I suspect BackupPC_refCountUpdate was the culprit, 
driving high disk seek rates.  I ended up renaming 
/var/lib/backuppc/pc/localhome and started a new localhome backup series.
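
For reference, the exclude itself is a one-liner in the localhome config; a 
sketch, keyed on the /home share described in the quoted message below:

    $Conf{BackupFilesExclude} = {
        '/home' => ['/backuppc'],
    };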

I am running 4.3.0 (manual build for Debian on ARM).  I have held off 
upgrading until an official build is available, but it appears I would need 
to move to 64-bit Bullseye, which is not trivial.

On August 23, 2022 11:38:53 AM EDT, Norbert Hoeller via BackupPC-users wrote:
>I have a home file server that also runs backuppc for a number of other 
>servers. Backuppc backs up /home (host localhome) on the file server so I can 
>recover from accidentally deleted files.
>
>Originally, /var/lib/backuppc was mapped to its own partition which 
>periodically caused space issues. I recently moved the backuppc data folder to 
>/home/backuppc and mapped it to /var/lib/backuppc but forgot to exclude the 
>/home/backuppc folder from backups of /home. Backuppc completed backup 325 and 
>part of 326 before I noticed the problem. I excluded /home/backuppc from 
>further backups and did a manual incremental backup which created 327 (partial 
>backup 326 disappeared).  
>
>Although the cpool size did not increase, pc/localhome is over 2GB bigger. I 
>tried running "BackupPC_Delete -h localhome -n 327 -s home backuppc" in the 
>hope that would clean up the pointers but it merged #327/home/backuppc -> 
>#325/home/backuppc.  I repeated the process with backup 325 which merged 
>#325/home/backuppc -> #324/home/backuppc.  Right now, pc/localhome/324 takes 
>up 644MB while pc/localhome/327 takes up 1015MB, compared to under 15MB before.
>
>Is there a way to get rid of the unnecessary pointers to /home/backuppc or do 
>I just wait for them to age out?


[BackupPC-users] Deleting backups

2022-08-23 Thread Norbert Hoeller via BackupPC-users
I have a home file server that also runs backuppc for a number of other 
servers. Backuppc backs up /home (host localhome) on the file server so I can 
recover from accidentally deleted files.

Originally, /var/lib/backuppc was mapped to its own partition which 
periodically caused space issues. I recently moved the backuppc data folder to 
/home/backuppc and mapped it to /var/lib/backuppc but forgot to exclude the 
/home/backuppc folder from backups of /home. Backuppc completed backup 325 and 
part of 326 before I noticed the problem. I excluded /home/backuppc from 
further backups and did a manual incremental backup which created 327 (partial 
backup 326 disappeared).  

Although the cpool size did not increase, pc/localhome is over 2GB bigger. I 
tried running "BackupPC_Delete -h localhome -n 327 -s home backuppc" in the 
hope that would clean up the pointers but it merged #327/home/backuppc -> 
#325/home/backuppc.  I repeated the process with backup 325 which merged 
#325/home/backuppc -> #324/home/backuppc.  Right now, pc/localhome/324 takes up 
644MB while pc/localhome/327 takes up 1015MB, compared to under 15MB before.
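
For anyone checking the same thing, the per-backup sizes can be compared with 
something like the following (the path assumes the Debian layout used here):

    du -sh /var/lib/backuppc/pc/localhome/324 /var/lib/backuppc/pc/localhome/327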

Is there a way to get rid of the unnecessary pointers to /home/backuppc or do I 
just wait for them to age out?


Re: [BackupPC-users] Backup Data Volumes

2010-07-18 Thread Norbert Hoeller
John, Craig identified and fixed a problem in File::RsyncP on ARM 
processors related to whether characters are treated as signed or 
unsigned.

I did stumble on another problem that I will post to the mailing list 
shortly.

I scanned the mailing list but did not see the email that you mention 
below.
Regards, Norbert
 




 From: John Rouillard rouilj-backu...@re... - 2010-06-30 18:47

 Well perhaps not. I posted an earlier email where I am transferring a
 lot of file data for old files that are in prior level 0 backups and
 are in the cpool.





[BackupPC-users] High Backup Data Volumes After Re-adding an Excluded Directory

2010-07-18 Thread Norbert Hoeller
While trying to diagnose the high backuppc data volumes issue posted to 
the mailing list on June 14th, I had excluded a directory structure 
containing about 140MB of data.  I removed the exclude once Craig had 
provided a fix for File::RsyncP and noticed that backup volumes jumped by 
about 150MB. Tracing suggested that all the files in the previously 
excluded directory structure were being backed up on every incremental 
backup, even though the content of the files was unchanged (the first 
incremental backup after the directory was re-added indicated that backuppc 
had found the files in the backup pool).

Although the contents of the files had not changed, I had 'touch'ed the 
files during the period when the directory structure was excluded so that 
Google Sitemap would index them.  It seems that the backuppc incremental 
backup got confused and repeatedly selected the files for backup even though 
the file dates were no longer changing.

File::RsyncP/rsync should have determined that the contents of the files 
were identical to the pool copy.  Verbose logging suggests that checksums 
were exchanged, but rsync did nothing with them (the remote system 
reported false_alarms=0 hash_hits=0 matches=0).  The reason is not clear. 
I had enabled checksum caching at one point, but disabling it did not 
change the symptoms.
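
For reference, checksum caching with the rsync transfer method is switched on 
by adding the checksum seed to the rsync argument lists; a sketch, using the 
same options shown in the June 14th message below (the seed also needs to be 
added to $Conf{RsyncRestoreArgs}):

    $Conf{RsyncArgs} = [
        '--numeric-ids', '--perms', '--owner', '--group', '-D',
        '--links', '--hard-links', '--times',
        '--block-size=2048', '--recursive',
        '--checksum-seed=32761',   # enables checksum caching; remove to disable
    ];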

The problem was 'fixed' by doing a full backup.  It appears that this 
caused rsync to properly compare checksums and backuppc updated the file 
date - the next incremental backup did not check the files that previously 
had been copied in full.  I 'touch'ed one of the files and verified that 
the next incremental backup checked the file but rsync found no changed 
blocks.


Re: [BackupPC-users] Backup Data Volumes

2010-06-30 Thread Norbert Hoeller
The high backuppc data volumes appear to be a problem with File::RsyncP on 
the Ubuntu port to the ARM architecture.  I have created a small test 
script that calls File::RsyncP to copy files from one directory to another 
on the same system.  Running the script the first time copies all the 
files (about 17MB).  On Linux/Intel, subsequent runs transfer only control 
information.  On Linux/ARM, a large number of the blocks are flagged as 
different even though the files themselves are identical.  As a result, 
7.5MB of the file data is transferred.

I have sent the results of the test to Craig Barratt and also posted to 
the ubuntu-devel-discuss mailing list.


Re: [BackupPC-users] Backup Data Volumes

2010-06-14 Thread Norbert Hoeller
I discontinued backup of my old web server this weekend and upgraded rsync 
on the new web server to 3.0.5 to be compatible with the backuppc server. 
This morning, backup traffic was close to 450MB.  I did one full backup 
(existing files 1492/14MB, new files 12/0MB) and three incrementals 
(existing files 3826/411MB, new files 778/27MB). 

The traffic pattern suggests that one of the incremental backups (existing 
files 3734/411MB, new files 664/21MB) accounted for the bulk of the 
traffic.  I had migrated multiple MediaWiki instances over the weekend, 
all using an identical code base.  One MediaWiki instance had been backed 
up last week.  Although the file counts and aggregate data are 
considerable, I would have expected rsync to detect that the files had 
already been stored on the backuppc server and not transfer them again. 
The data volumes suggest otherwise.

Am I missing something obvious?
Thanks, Norbert



Re: [BackupPC-users] Backup Data Volumes

2010-06-14 Thread Norbert Hoeller
Below are the rsync options - I do not recall making any changes from the 
defaults.

rsync --server --sender --numeric-ids --perms --owner --group -D --links 
--hard-links --times --block-size=2048 --recursive . /var/symlink/

Aside from backing up a symbolic link rather than the full (and rather 
long) directory path that I used on the old web server, another difference 
is the new backup server architecture - it is a 'plug computer' with an 
ARM processor running Ubuntu 9.04.
Thanks, Norbert



[BackupPC-users] Backup Data Volumes

2010-06-12 Thread Norbert Hoeller
I have been using backuppc 2.1.2 for a number of years to back up a Linux 
web server (rsync 2.6.9) to a local server (also running rsync 2.6.9).  My 
recollection was that the amount of data transferred by backuppc was quite 
low (around 40MB) regardless of whether I was doing a full backup or an 
incremental backup. 

In late February, I built a new backup server running backuppc 3.1.0 and 
rsync 3.0.5.  Over the last month, I have been migrating the Linux web 
server from a shared hosting environment to a virtual private server 
running rsync 2.6.8.  I recently have been tracking bandwidth due to ISP 
caps.  This morning's backups appear to have downloaded about 230MB to the 
local server and uploaded about 10MB. 

Going over the logs, I see 4 incremental backups totalling 116 existing 
files (3MB) and 207 new files (24MB).  I started one full backup on the 
old shared hosting server that failed after 18 minutes with 'Aborting 
backup up after signal PIPE'.  A partial dump was saved.  Another attempt 
was made the next hour and completed after 4 minutes with 1752 existing 
files (123MB) and 7 new files (6MB). 

Clearly, a lot of things have changed, including a large increase in the 
number of system files that backuppc needs to check.  How much of the 
240MB of backup traffic could be attributed to transferring control 
information?  My reading of the documentation suggests that rsync only 
transfers changed blocks even on a full backup, so the data portion of 
the backup should be at most 30MB.  Could the mismatch in rsync versions 
be causing issues?  The VPS server that I am backing up is running CentOS 
and 2.6.8 is the most current rsync available on the standard 
repositories.
 
Any suggestions would be greatly appreciated!
 Thanks, Norbert


[BackupPC-users] Trick for Restoring Drupal Website via tar File

2009-09-28 Thread Norbert Hoeller
I tested restoring a Drupal website by having backuppc generate a tar 
file, uploading the file to the server and then extracting the tar file to 
the new Drupal directory structure.  A large number of files were not 
restored because a number of Drupal sub-directories are read-only.  Errors 
included 'Cannot open: Permission denied' and 'Cannot open: No such file 
or directory'. 

I found a reference to the '--delay-directory-restore' option at 
http://www.gnu.org/software/tar/manual/tar.html#SEC77 that solved this 
problem.
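
A sketch of the extract step with that option (archive name and target 
directory are placeholders):

    # directory permissions and times are applied only after all files are
    # extracted, so read-only Drupal sub-directories no longer block the restore
    tar -x -v -p -f sitebackup.tar --delay-directory-restore -C /path/to/drupal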


Re: [BackupPC-users] Backing Up Symbolic Links Using Rsync

2007-06-27 Thread Norbert Hoeller
> What are your rsync options?
> I'd be curious to see the XferLOG file with $Conf{XferLogLevel} set to 4.

Craig, the backup command is: $Conf{RsyncClientCmd} = '$sshPath -q -x -l 
userid $host $rsyncPath $argList+'; 

Clearly, I am blind (:-).  The symlinks are being backed up, although not 
flagged as directories.  Backuppc also does not recurse into the target 
directory.  Here is an extract from the XferLOG for a normal directory:

attribSet(dir=f.%2fbioeducation, file=images)
  create d 755 10102/6004096 images
attribSet(dir=f.%2fbioeducation, file=images)

and a symbolically linked directory:

attribSet(dir=f.%2fbioeducation, file=maintenance)
  pool   l 777 10102/600  77 maintenance
attribSet(dir=f.%2fbioeducation, file=maintenance)

I will do a restore of the common files and directories that are the 
targets of the symlinks and then restore the symlinks themselves - that 
will be the true test. 
Thanks! Norbert

PS.  Thanks for the fast response on the mixed-case host configuration 
files.  It did throw me for a loop for a bit, until I noticed that the 
host-specific section of the GUI did not show any configuration file.


[BackupPC-users] Backing Up Symbolic Links Using Rsync

2007-06-26 Thread Norbert Hoeller
I am backing up a directory structure on a Linux server running rsync 
version 2.5.6cvs, protocol version 26.  Most of the files and 
subdirectories are symbolic links to a common 'source' directory 
structure.  It appears that backuppc is backing up the symbolic links to 
files, but the XferLOG shows no indication that any of the symbolic links 
to directories are being processed.  So far, I have done a full and 
incremental backup.  Is this something I should worry about if I need to 
do a restore?  I am backing up the 'source' directory separately, so I can 
always re-establish the links manually.
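
For context, this is standard rsync behaviour with --links rather than 
anything backuppc-specific; a quick illustration with placeholder paths:

    rsync -rtl /some/tree/ /copy/   # -l/--links: symlinks are recreated as symlinks;
                                    # the directories they point to are not recursed into
    rsync -rtL /some/tree/ /copy/   # -L/--copy-links follows the links and copies
                                    # the referenced files and directories instead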

I am running backuppc 2.1.2 on an Ubuntu 6.10 server - waiting for a 
package to be available before upgrading. 
Thanks, Norbert


[BackupPC-users] Backuppc Does Not Like Mixed-Case Host Config File?

2007-06-26 Thread Norbert Hoeller
I created a new share on an existing web server that I am backing up.  The 
host configuration file was called '1and1-MW-common.pl', consistent with 
the case of the directory that I was backing up.  I had added 
'1and1-MW-common 0 user' to the backuppc 'hosts' file.  The backups were 
not starting, supposedly because of slow PING times, even though I had set 
'$Conf{PingMaxMsec} = 1000;' in the configuration file.  To make a long 
story short, it appears that backuppc converted the host name in the 
'hosts' file to lower case, and was looking for (but did not find) 
'1and1-mw-common.pl'.  I renamed the host configuration file and 
everything is sunny once again.
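
In other words, the hosts entry can stay mixed case, but the per-host config 
file name needs to be all lower case; a sketch assuming the Debian/Ubuntu 
package layout:

    # /etc/backuppc/hosts        (host  dhcp  user)
    1and1-MW-common    0    user
    # backuppc lower-cases the host name internally, so the per-host config
    # file must be /etc/backuppc/1and1-mw-common.pl (all lower case)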

I am running backuppc 2.1.2 on an Ubuntu 6.10 server - waiting for a 
package to be available before upgrading. 
Regards, Norbert


[BackupPC-users] BackupPC 2.1.2-5 Reporting XferErrs on Successful Local Restore

2007-01-20 Thread Norbert Hoeller
I installed BackupPC 2.1.2-5 on an Ubuntu 6.10 server system for local 
backups.  The only tailoring I needed to do was:
* defined the directories to be backed up
* modified $Conf{TarClientCmd} = '/usr/bin/sudo $tarPath -c -v -f - -C $shareName+' . ' --totals';
* added $Conf{TarClientRestoreCmd} = '/usr/bin/sudo $tarPath -x -v -f - -C $shareName+' . ' --totals';

Full and incremental backups appear to be working fine.  I can 
successfully restore files, except BackupPC reports #xferErrs=1:

    Restore#  Result   Start Date  Dur/mins  #files  MB    #tar errs  #xferErrs
    5         success  1/20 14:52  0.0       13      10.6  0          1

Error log contains no indication of any problems. 
Contents of file /var/lib/backuppc/pc/localhost/RestoreLOG.5, modified 
2007-01-20 14:52:11 (Extracting only Errors) 
Running: /usr/bin/sudo /bin/tar -x -v -f - -C /home --totals
Running: /usr/share/backuppc/bin/BackupPC_tarCreate -h localhost -n 9 -s 
/home -t -r /user -p /user2/ /user
Xfer PIDs are now 6128,6129
tarCreate: Done: 13 files, 11138482 bytes, 2 dirs, 0 specials, 0 errors
Total bytes read: 11151360 (11MiB, 7.9MiB/s)

So far, I have not found which part of the code thinks there is an error 
in the restore.

Although a minor problem, any suggestions for getting rid of this issue 
would be appreciated! 
Thanks, Norbert


Re: [BackupPC-users] BackupPC 2.1.2-5 Reporting XferErrs on Successful Local Restore

2007-01-20 Thread Norbert Hoeller
Craig, the fix to Tar.pm worked like a charm!
Thanks, Norbert

PS.  Great application!  Does everything I want it to do, with very little 
effort on my part.  I successfully tested out archiving today as a means 
of creating monthly offline backups.  Next step is backing up Windows 
workstations.
  
