[BackupPC-users] Backuppc authentication mail

2011-02-22 Thread egrimisu
Hi, as I was really busy in the past year I hadn't tried your config until
now, and I wanted to let you know that everything worked flawlessly. Thanks again.
Misu






Re: [BackupPC-users] I can't seem to re-install BackupPC after messing around with mountpoints

2011-02-22 Thread Carl Wilhelm Soderstrom
On 02/21 06:16 , Dennis Blewett wrote:
 2011-02-21 18:11:04 Can't create a test hardlink between a file in
 /var/lib/backuppc/pc and /var/lib/backuppc/cpool.  Either these are
 different file systems, or this file system doesn't support hardlinks, or
 these directories don't exist, or there is a permissions problem, or the
 file system is out of inodes or full.  Use df, df -i, and ls -ld to check
 each of these possibilities. Quitting...

Glad to know you got this solved; but did you try making a hardlink on
that filesystem yourself to see if that was indeed the problem?
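
For the archives, here is one way to run that test by hand (paths
assume the stock Debian layout; adjust as needed):

  # create a file and try to hard-link it between the two directories
  touch /var/lib/backuppc/pc/linktest
  ln /var/lib/backuppc/pc/linktest /var/lib/backuppc/cpool/linktest \
      && echo 'hardlinks OK' || echo 'hardlink failed'
  rm -f /var/lib/backuppc/pc/linktest /var/lib/backuppc/cpool/linktest
  # then check the other failure modes the error message lists
  df /var/lib/backuppc/pc /var/lib/backuppc/cpool  # same fs? space left?
  df -i /var/lib/backuppc                          # out of inodes?
  ls -ld /var/lib/backuppc/pc /var/lib/backuppc/cpool  # permissions?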

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Trying to get everything under one foldertree

2011-02-22 Thread Carl Wilhelm Soderstrom
On 02/21 10:03 , Dennis Blewett wrote:
 I'm using the web interface with localhost and tar as the Xfer method.
 
 Let's say I have these folders:
 
 /home/workstation/Desktop
 /home/workstation/Documents
 /office
 /research
 
 And I want all of those to be listed in /.
 
 So, when I look at the virtual tree in the web interface it has...
 
 + /
 - /home/workstation/Desktop
 - /home/workstation/Documents
 - /office
 - /research
 
 Kind of the idea where everything is listed under a / tree.
 I can't seem to get it to look that way.


I'm not sure what you're doing, especially since I don't use the web GUI.

Can you paste the configuration file itself?

It sounds like you may be backing up 4 separate 'shares' rather than 4
subdirectories under '/'. 

Perhaps look into BackupFilesOnly and BackupFilesExclude? (i.e. back up '/'
and then exclude what you don't want, or list the items you want to back
up).
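
As a sketch of the second approach, the host's config.pl might contain
something like this (share and path names are taken from your example;
untested):

  $Conf{XferMethod}   = 'tar';
  $Conf{TarShareName} = ['/'];
  $Conf{BackupFilesOnly} = {
      '/' => [
          '/home/workstation/Desktop',
          '/home/workstation/Documents',
          '/office',
          '/research',
      ],
  };

That backs up a single '/' share, restricted to those four trees, so
the browse view shows one tree rooted at /.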


-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Backup lasts forever rsync with linux

2011-02-22 Thread Les Mikesell
On 2/22/2011 9:17 AM, Rob Morin wrote:
 So I have this server that started its backup at 10pm yesterday. The
 logs below say it finished at 1:52; however, the status page still
 shows it running with a start time of 5:50am, and my server has a high
 load, with a PID that matches the backup process for the server being
 backed up.

 What's going on?

After the transfer completes, the link job runs, which is what you are 
seeing.  This processes any files that did not match the previous full 
run by compressing them and linking them into the pool.
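
A quick way to confirm which phase a host is in is to look for the link
process on the BackupPC server (the [B] keeps grep from matching
itself):

  ps ax | grep '[B]ackupPC_link'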

-- 
   Les Mikesell
lesmikes...@gmail.com





Re: [BackupPC-users] Backup lasts forever rsync with linux

2011-02-22 Thread Mark Maciolek
hi,

Right from the documentation:

For each complete, good, backup, BackupPC_link is run. To avoid race 
conditions as new files are linked into the pool area, only a single 
BackupPC_link program runs at a time and the rest are queued.
BackupPC_link reads the NewFileList written by BackupPC_dump and 
inspects each new file in the backup. It re-checks if there is a 
matching file in the pool (another BackupPC_link could have added the 
file since BackupPC_dump checked). If so, the file is removed and 
replaced by a hard link to the existing file. If the file is new, a hard 
link to the file is made in the pool area, so that this file is 
available for checking against each new file and new backup.

Then, for incremental backups, hard links are made in the new backup to 
all files that were not extracted during the incremental backup. This 
means the incremental dump looks like a complete image of the PC (with 
the exception that files that were removed on the PC since the last full 
dump will still appear in the backup directory tree).



On 2/22/2011 10:17 AM, Rob Morin wrote:
 So I have this server that started its backup at 10pm yesterday. The
 logs below say it finished at 1:52; however, the status page still
 shows it running with a start time of 5:50am, and my server has a high
 load, with a PID that matches the backup process for the server being
 backed up.

 What's going on?

 Log file :

 2011-02-21 22:00:00 full backup started for directory /etc

 2011-02-21 22:01:41 full backup started for directory /home

 2011-02-22 01:35:02 full backup started for directory /usr/local/src

 2011-02-22 01:37:25 full backup started for directory
 /var/lib/mysql/mysql_backup

 2011-02-22 01:52:58 full backup 0 complete, 999677 files, 57653209583
 bytes, 0 xferErrs (0 bad files, 0 bad shares, 0 other)

 Status:

 Currently Running Jobs:

 Host:       k1.interhub.local
   (https://mail.6948065.com/backuppc/index.cgi?host=k1.interhub.local)
 Type:       full
 User:       root
 Start Time: 2/22 05:50
 Command:    BackupPC_link k1.interhub.local
 PID:        24726
 Xfer PID:   (none)

 ps output on the BackupPC server:

 6431 ? S 1:40 /usr/bin/perl /usr/share/backuppc/bin/BackupPC -d

 6433 ? S 0:33 /usr/bin/perl /usr/share/backuppc/bin/BackupPC_trashClean

 17849 pts/0 S+ 0:00 grep -i back

 24726 ? D 2:50 /usr/bin/perl /usr/share/backuppc/bin/BackupPC_link
 k1.interhub.local

 top output on the BackupPC server:

 top - 10:16:39 up 21:04, 2 users, load average: 3.97, 4.54, 4.70

 Tasks: 221 total, 1 running, 220 sleeping, 0 stopped, 0 zombie

 Cpu(s): 0.0%us, 0.0%sy, 0.0%ni, 76.8%id, 23.1%wa, 0.0%hi, 0.0%si, 0.0%st

 Mem: 8193608k total, 8109764k used, 83844k free, 3347896k buffers

 Swap: 98052080k total, 640k used, 98051440k free, 1880360k cached

 Rob Morin

 Systems Administrator

 Infinity Labs Inc.

 (514) 387-0638 Ext: 207





-- 
Mark Maciolek
Network Administrator
Morse Hall 339
862-3050
mark.macio...@unh.edu
https://www.sr.unh.edu



Re: [BackupPC-users] I can't seem to re-install BackupPC after messing around with mountpoints

2011-02-22 Thread Anton Dollmaier
Hi,


 Glad to know you got this solved; but did you try making a hardlink on
 that filesystem yourself to see if that was indeed the problem?

On Debian, simply re-installing the backuppc package does not recreate 
the folders in /var/lib/backuppc (pc, cpool, pool, etc.).


You'll have to remove/purge and then install again - like Dennis already 
did.
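
On Debian that is roughly (assuming the stock package name):

  apt-get remove --purge backuppc
  apt-get install backuppc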

So there was no need to test the hardlinks - the needed directories 
didn't exist at all.


I ran into the same issue once.


best regards,

Anton



Re: [BackupPC-users] Backup lasts forever rsync with linux

2011-02-22 Thread Rob Morin
OK, thanks. I had another first full backup run and it took 14 hours to
complete; it was under 125 gigs...

Thanks for the info, Mark & Les



Rob Morin
Systems Administrator
Infinity Labs Inc.
(514) 387-0638 Ext: 207








Re: [BackupPC-users] Backup backuppc pool with rsync offsite

2011-02-22 Thread gregwm
 rsync'ing the BackupPC data pool is generally recommended against. The
 number of hardlinks causes an explosive growth in memory consumption by
 rsync, and while you may be able to get away with it if you have 20GB of
 data (depending on how much memory you have), you will likely run out of
 memory as your amount of data gets larger.

This issue sure comes up a lot, and perhaps I should just keep quiet,
since I am personally in no position to do it, or even to go looking for
an rsync forum, and I have no idea how convoluted the rsync source may
be. But as a naive outsider, it still seems it ought not to be such a
task to have a go at the rsync source and come out with a version that
sorts its transfer list into [filesystem:inode] order when preserving
hardlinks, or that simply keeps a [filesystem:inode] index of files
already transferred, in place of whatever mangy hardlink table is in
there now.



Re: [BackupPC-users] Backup backuppc pool with rsync offsite

2011-02-22 Thread Timothy J Massey
gregwm backuppc-us...@whitleymott.net wrote on 02/22/2011 11:26:51 AM:

 This issue sure comes up a lot, and perhaps I should just keep quiet,
 since I am personally in no position to do it, or even to go looking for
 an rsync forum, and I have no idea how convoluted the rsync source may
 be. But as a naive outsider, it still seems it ought not to be such a
 task to have a go at the rsync source and come out with a version that
 sorts its transfer list into [filesystem:inode] order when preserving
 hardlinks, or that simply keeps a [filesystem:inode] index of files
 already transferred, in place of whatever mangy hardlink table is in
 there now.

The software is not far off from doing that.  Now multiply that by a 
billion files...

The reason some people claim it works and some people have it fail (or 
cancel it tens of hours into the process) comes down to how many files are 
in the pool and how much RAM they have.  For systems with 8GB of RAM each, 
and with a few hundred thousand files, it'll most likely work.  For 
systems with 1GB and a billion files, forget it.  They thrash terribly, 
and the time it would take to finish might be measured in *months*.
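
(Back of the envelope: if rsync keeps something like 100 bytes of
hardlink bookkeeping per file -- a guess, not a measured figure -- a few
hundred thousand files costs tens of MB, which 8GB swallows easily,
while a billion files costs on the order of 100GB of tables, which is
hopeless in 1GB.)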

Tim Massey

 
Out of the Box Solutions, Inc. 
Creative IT Solutions Made Simple!
http://www.OutOfTheBoxSolutions.com
tmas...@obscorp.com 
 
22108 Harper Ave.
St. Clair Shores, MI 48080
Office: (800)750-4OBS (4627)
Cell: (586)945-8796 


Re: [BackupPC-users] Backup backuppc pool with rsync offsite

2011-02-22 Thread Jeffrey J. Kosowsky
gregwm wrote at about 10:26:51 -0600 on Tuesday, February 22, 2011:
   rsync'ing the BackupPC data pool is generally recommended against. The
   number of hardlinks causes an explosive growth in memory consumption by
   rsync, and while you may be able to get away with it if you have 20GB of
   data (depending on how much memory you have), you will likely run out of
   memory as your amount of data gets larger.
  
  This issue sure comes up a lot, and perhaps I should just keep quiet,
  since I am personally in no position to do it, or even to go looking for
  an rsync forum, and I have no idea how convoluted the rsync source may
  be. But as a naive outsider, it still seems it ought not to be such a
  task to have a go at the rsync source and come out with a version that
  sorts its transfer list into [filesystem:inode] order when preserving
  hardlinks, or that simply keeps a [filesystem:inode] index of files
  already transferred, in place of whatever mangy hardlink table is in
  there now.
  

Well, I wrote a Perl script that does something quite similar:
- Creates a pool indexed by inode number
- Creates a list of hard links
- Rsyncs the pool itself (without -H)
- Recreates the hard links

Search the archives in the past few weeks for BackupPC_copyPcPool.pl
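
For anyone who just wants the flavor of those steps, a rough shell
sketch (this is NOT BackupPC_copyPcPool.pl; paths and the offsite host
are illustrative, and anything in pc/ that is not pooled would still
need a plain copy):

  cd /var/lib/backuppc
  # record every (inode, path) pair so the links can be rebuilt later
  find cpool pc -type f -printf '%i %p\n' | sort -n > /tmp/linkmap
  # copy one on-disk instance of each file, without rsync's -H tables
  rsync -a cpool/ offsite:/var/lib/backuppc/cpool/
  scp /tmp/linkmap offsite:/tmp/linkmap
  # on the far side, walk /tmp/linkmap and hard-link each additional
  # path for an inode to the first path created for that inode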



Re: [BackupPC-users] Trying to get everything under one foldertree

2011-02-22 Thread Bowie Bailey
On 2/21/2011 11:03 PM, Dennis Blewett wrote:
 I'm using the web interface with localhost and tar as the Xfer method.

 Let's say I have these folders:

 /home/workstation/Desktop
 /home/workstation/Documents
 /office
 /research

 And I want all of those to be listed in /.

 So, when I look at the virtual tree in the web interface it has...

 + /
 - /home/workstation/Desktop
 - /home/workstation/Documents
 - /office
 - /research

 Kind of the idea where everything is listed under a / tree.
 I can't seem to get it to look that way.

 How do I get it that way?

This question doesn't make sense.

Please restate your question and provide the config for this backup so
we can see what you have done.

-- 
Bowie



Re: [BackupPC-users] Backup backuppc pool with rsync offsite

2011-02-22 Thread Dennis Blewett
13,849 items, totalling 3.8 GB.

It would appear that I have a feasible number of files. I'm not sure how
many more files I will have by the end of April, though.

I've read that rsync -H would be a practical command to use on the
backuppc folder.

What I'm also curious about is whether I should be rsyncing any other
files, so that I could restore from the offsite backup in case I lose
everything and rebuild a BackupPC configuration: I would rsync back to
my computer with the new BackupPC configuration and attempt to
restore/recover the files.

On Tue, Feb 22, 2011 at 9:41 AM, Carl Wilhelm Soderstrom 
chr...@real-time.com wrote:

 On 02/21 11:00 , Dennis Blewett wrote:
  Will I come across many problems in later restoring the pool's data if I
  just rsync /var/lib/backuppc to the server?
  Are there other files and folders I should be rsync'ing to the server?

 rsync'ing the BackupPC data pool is generally recommended against. The
 number of hardlinks causes an explosive growth in memory consumption by
 rsync, and while you may be able to get away with it if you have 20GB of
 data (depending on how much memory you have), you will likely run out of
 memory as your amount of data gets larger.

 Future versions of BackupPC may deal with this problem better.

 There are ways of replicating the BackupPC data pool remotely, but I
 have not tried them myself.

 --
 Carl Soderstrom
 Systems Administrator
 Real-Time Enterprises
 www.real-time.com





Re: [BackupPC-users] Backup backuppc pool with rsync offsite

2011-02-22 Thread Timothy J Massey
Dennis Blewett dennis.blew...@gmail.com wrote on 02/22/2011 10:17:29 PM:

 13,849 items, totalling 3.8 GB.

 It would appear that I have a feasible number of files. I'm not sure
 how many more files I will have by the end of April, though.

 I've read that rsync -H would be a practical command to use on the
 backuppc folder.

 What I'm also curious about is whether I should be rsyncing any other
 files, so that I could restore from the offsite backup in case I lose
 everything and rebuild a BackupPC configuration: I would rsync back to
 my computer with the new BackupPC configuration and attempt to
 restore/recover the files.

Please Google this and read the results:  this question has been asked 
dozens of times, and always boils down to the same points.  I will sum it 
up *briefly*:

* For most people, rsync does not work to replicate a backup server 
effectively.  Period.  I think *no* one would suggest this as a reliable 
ongoing method of replicating a BackupPC server.  Ever.

* The best methods for this boil down to two camps:
        1) Run two BackupPC servers and have both back up the hosts
           directly. No replication at all: it just works.
        2) Use some sort of block-based method of replicating the data.

* Block-based replication boils down to two methods:
        1) Use md or dm to create a RAID-1 array and rotate members of
           this array in and out.
        2) Use LVM to create snapshots of partitions and dd the
           partition to a different drive (rough sketch below).
        (I guess 3) Stop BackupPC long enough to do a dd of the
           partition *without* LVM.)


You may have a dozen different comments on why you want to use rsync, how 
much better and easier it would be, why a block-based clone doesn't do 
what you want, how cumbersome and annoying it would be, etc. etc.  That is 
*all* well and good, and nearly all of us would agree with you--except for 
the fact that an rsync of the pool just doesn't work in the vast majority 
of cases.

Tim Massey

 
Out of the Box Solutions, Inc. 
Creative IT Solutions Made Simple!
http://www.OutOfTheBoxSolutions.com
tmas...@obscorp.com 
 
22108 Harper Ave.
St. Clair Shores, MI 48080
Office: (800)750-4OBS (4627)
Cell: (586)945-8796 

