Re: [BackupPC-users] Restore issues

2009-08-05 Thread Vetch
Hi Matthias,
Sorry it took me so long to get back to you - I've had a lot of tight
deadlines.
I'm not backing up a Windows client... It's a Linux server offering a Samba
share.
To be fair, thinking about it, since I'm using rsync, it's just a Linux
server, and the fact that I access it with Windows clients is irrelevant.
No VSS - because Linux...
Good news on the backup front...
Still, it's very strange that Windows reports different numbers of files
when looking at the share, but I guess that's one of those glitchy things
with MS products...
I think pretty much the only things I back up are /home, /etc, /usr and
/var...
Thanks for your help on this...
Cheers,
Jx

On Thu, Jul 30, 2009 at 8:00 PM, Matthias Meyer matthias.me...@gmx.li wrote:

 Vetch wrote:

  Hi Matthias,
  All my xferlogs say they have 0 errors (apart from one, but that was
 after
  the problem occurred anyway)...
  I've had a look at them, but they are... quite long...
  Without knowing what to search for, I'm not sure what I can do with
  them... If they report no errors, I guess I can assume all files are
  backing up properly?
  Xfer Error Summary
 
 
  Backup#  Type  View             #Xfer errs  #bad files  #bad share  #tar errs
  0        full  XferLOG, Errors  0           0           0           0
  28       full  XferLOG, Errors  0           0           0           0
  56       full  XferLOG, Errors  0           0           0           0
  84       full  XferLOG, Errors  0           0           0           0
  112      full  XferLOG, Errors  0           0           0           0
  126      full  XferLOG, Errors  0           0           0           0
  133      full  XferLOG, Errors  0           0           0           0
  140      full  XferLOG, Errors  0           0           0           0
  147      full  XferLOG, Errors  0           0           0           0
  150      incr  XferLOG, Errors  0           0           0           0
  151      incr  XferLOG, Errors  0           0           0           0
  152      incr  XferLOG, Errors  0           0           0           0
  153      full  XferLOG, Errors  1           0           0           0
  154      incr  XferLOG, Errors  0           0           0           0
  155      incr  XferLOG, Errors  0           0           0           0
  156      incr  XferLOG, Errors  0           0           0           0
 
  Thanks,
  Jx

 That is unbelievable.
 You are backing up a Windows client, right?
 There should be a lot of files that cannot be backed up because they are in
 use.
 Do you use Volume Shadow Copies in Windows?

 But nevertheless, you have backups of all files specified in your
 configuration.
 If you check your backup include/exclude configuration, you should find
 which files are not backed up.

 br
 Matthias
 --
 Don't Panic





Re: [BackupPC-users] Restore issues

2009-08-05 Thread Vetch
Hi Jeffrey,
Sounds doable, but as I say, I'm backing up a Linux server, not a Windows
client, so I don't think I need to worry too much about the busy files...
Thanks for the suggestion though...
Cheers,
Jx

On Thu, Jul 30, 2009 at 8:19 PM, Jeffrey J. Kosowsky
backu...@kosowsky.org wrote:

 Matthias Meyer wrote at about 21:00:54 +0200 on Thursday, July 30, 2009:
   Vetch wrote:
  
Hi Matthias,
All my xferlogs say they have 0 errors (apart from one, but that was
 after
the problem occurred anyway)...
I've had a look at them, but they are... quite long...
Without knowing what to search for, I'm not sure what I can do with
them... If they report no errors, I guess I can assume all files are
backing up properly?
Xfer Error Summary
   
   
Backup#  Type  View             #Xfer errs  #bad files  #bad share  #tar errs
0        full  XferLOG, Errors  0           0           0           0
28       full  XferLOG, Errors  0           0           0           0
56       full  XferLOG, Errors  0           0           0           0
84       full  XferLOG, Errors  0           0           0           0
112      full  XferLOG, Errors  0           0           0           0
126      full  XferLOG, Errors  0           0           0           0
133      full  XferLOG, Errors  0           0           0           0
140      full  XferLOG, Errors  0           0           0           0
147      full  XferLOG, Errors  0           0           0           0
150      incr  XferLOG, Errors  0           0           0           0
151      incr  XferLOG, Errors  0           0           0           0
152      incr  XferLOG, Errors  0           0           0           0
153      full  XferLOG, Errors  1           0           0           0
154      incr  XferLOG, Errors  0           0           0           0
155      incr  XferLOG, Errors  0           0           0           0
156      incr  XferLOG, Errors  0           0           0           0
   
Thanks,
Jx
  
   That is unbelievable.
   You are backing up a Windows client, right?
   There should be a lot of files that cannot be backed up because they are
   in use.
   Do you use Volume Shadow Copies in Windows?

 Alternatively, you could just exclude the files that tend to be
 busy. Before I wrote my volume shadow copy script, I had a short list
 of excludes that eliminated all busy files.
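
 A minimal sketch of such an exclude list, using BackupPC's per-host Perl
 config ($Conf{BackupFilesExclude}); the share name, patterns, and file
 path here are illustrative guesses, not the actual list mentioned above:

     # /etc/backuppc/myclient.pl (path varies by install)
     # keys are share names; values are exclude patterns for that share
     $Conf{BackupFilesExclude} = {
         '/home' => [
             '/*/.gvfs',   # FUSE mounts that block reads
             '*.lock',     # transient lock files
             '/*/tmp/*',   # per-user temp data
         ],
     };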

  
   But nevertheless, you have backups of all files specified in your
   configuration.
   If you check your backup include/exclude configuration, you should find
   which files are not backed up.
  
   br
   Matthias
   --
   Don't Panic
  
  
  


Re: [BackupPC-users] Restore issues

2009-07-24 Thread Vetch
Hi Matthias,
All my xferlogs say they have 0 errors (apart from one, but that was after
the problem occurred anyway)...
I've had a look at them, but they are... quite long...
Without knowing what to search for, I'm not sure what I can do with them...
If they report no errors, I guess I can assume all files are backing up
properly?
Xfer Error Summary


Backup#  Type  View             #Xfer errs  #bad files  #bad share  #tar errs
0        full  XferLOG, Errors  0           0           0           0
28       full  XferLOG, Errors  0           0           0           0
56       full  XferLOG, Errors  0           0           0           0
84       full  XferLOG, Errors  0           0           0           0
112      full  XferLOG, Errors  0           0           0           0
126      full  XferLOG, Errors  0           0           0           0
133      full  XferLOG, Errors  0           0           0           0
140      full  XferLOG, Errors  0           0           0           0
147      full  XferLOG, Errors  0           0           0           0
150      incr  XferLOG, Errors  0           0           0           0
151      incr  XferLOG, Errors  0           0           0           0
152      incr  XferLOG, Errors  0           0           0           0
153      full  XferLOG, Errors  1           0           0           0
154      incr  XferLOG, Errors  0           0           0           0
155      incr  XferLOG, Errors  0           0           0           0
156      incr  XferLOG, Errors  0           0           0           0

Thanks,
Jx
On Fri, Jul 24, 2009 at 9:33 AM, Matthias Meyer matthias.me...@gmx.li wrote:

 You should check the backup log file, XferLOG.
 Each file which cannot be backed up will be logged there, including the
 reason.
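
 A quick way to scan those long logs from the shell (a sketch: the BackupPC
 bin and __TOPDIR__ paths are Debian-style guesses, and "myclient" stands
 in for the real host name):

     # XferLOG files are compressed; view them with BackupPC's own zcat
     sudo -u backuppc /usr/share/backuppc/bin/BackupPC_zcat \
         /var/lib/backuppc/pc/myclient/XferLOG.153.z | less

     # or filter for anything the transfer flagged as a problem
     sudo -u backuppc /usr/share/backuppc/bin/BackupPC_zcat \
         /var/lib/backuppc/pc/myclient/XferLOG.153.z | grep -Ei 'error|denied|fail'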

 br
 Matthias
 --
 Don't Panic





[BackupPC-users] Fwd: Restore issues

2009-07-21 Thread Vetch
Hi,
Just for the record, e2fsck -f didn't throw up any problems...
I assume that means there were no issues and the device is clean?
Cheers,
Jx

-- Forwarded message --
From: Vetch vetch...@googlemail.com
Date: Tue, Jul 21, 2009 at 11:06 AM
Subject: Re: [BackupPC-users] Restore issues
To: General list for user discussion, questions and support 
backuppc-users@lists.sourceforge.net


Hi Matthias,
Thanks for getting back to me...
I've put my replies below...

On Mon, Jul 20, 2009 at 8:59 PM, Matthias Meyer matthias.me...@gmx.li wrote:

 Vetch wrote:

  Hi all,
 
  I have a question...
 
  I tried restoring a backup to my system - it kept on failing with aborted
  by signal=PIPE.
  I have a feeling it may have been related to corrupt files within the
  hardlinks, since when I tried restoring using tar, the tar files were not
  readable, and the zip files didn't have as many files in as I would have
  expected...
 
  I didn't seem to be able to resolve that, so I wiped the system (keeping
  the home directory intact), and rebuilt the server.

 Which system? Your BackupPC server or your backup client?

My backup client - the backuppc server is still as was...
The client was running as (among other things) a samba file server with all
data stored in the home directory...


   Now, unfortunately, the software I use to run the file server decided to
  delete my home directories when I restored the configuration...
  Go figure...
  So now, I'm really worried because I think that I will have lost about
  100GB of data that I believed was backed up...
  Anyway, I'm trying to restore now, but the tar files are still coming up
  as unreadable...
 
  I guess if there are problems with the data, there's probably nothing
 that
  can be done?

 Probably not much :-(

Ok - I tried direct restores back into the original directories over the
network - and it came up with successful restores for all the home
directories...
Does this mean that the data has been fully successfully restored?
I think I have about 1000 files missing (out of about 35000)...
Now, this wouldn't be the end of the world, but I'd be interested to know if
when it reports success, it has definitely brought back the entire
dataset...
... and if so... do you have any suggestions as to why I may have different
numbers of files?


 
  How does backuppc check the consistency of the data once it's been added
  to the pool?
  I know that it only backs up the data once, but how does it make sure
 that
  the data is in a valid state?
  Is it checksums? If so, then I guess it can't be a corruption error...
  ... but I don't get why I would be finding directories of 3000 files
  showing only 1500 in a full backup...

 I don't believe that BackupPC checks data consistency. That is the job
 of the filesystem. e2fsck will do that for ext2/3 filesystems. You should
 run it regularly.

Ok... (see below)...


 
  Can anyone help? Any support would be greatly appreciated...
  I've got all the coursework from a Masters and a PhD on there!
 
  Jx

 I assume you are running BackupPC on Linux and use ext3 as the filesystem.
 So I would advise:
 1) make an image backup (e.g. partimage) of your /var/lib/backuppc, or
 wherever your __TOPDIR__ resides.
 2) try to open some files (select files which are important to you) from
 the BackupPC GUI (on your backup client, open http://yourserver/backuppc,
 log in, left-hand menu, Browse Backups, click on a file and choose open).
 3) run e2fsck on the device where /var/lib/backuppc is located (unmount the
 device first) and expect to lose data if e2fsck finds filesystem errors (a
 sketch follows below).

 You should run e2fsck as soon as possible. Otherwise filesystem errors can
 grow and destroy more data than necessary.
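
 A sketch of step 3, run from a live CD or rescue shell (/dev/sdb1 is a
 placeholder for whatever device actually holds __TOPDIR__):

     umount /var/lib/backuppc   # never fsck a mounted filesystem
     e2fsck -f /dev/sdb1        # -f forces a full check even if marked clean
     mount /var/lib/backuppc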

I've just tried this - I booted to a live CD and e2fsck-ed the device...
On first scan, it reported clean... I'm now running an e2fsck -f to force it
to check, but assuming that it reports the device as clean, can I
assume that the backups are not corrupted?
In which case, I have to wonder about the missing files...
Am I just worrying unnecessarily?


 I had a similar problem 7 months ago. A lot of filesystem errors, and I
 lost most of my backups. Fortunately I didn't lose data on my backup
 clients at the same time.

Heh - yes, it's always a little concerning when your backup solution starts
playing up at exactly the same time that you lose all your data ;)
Luckily, I'm feeling fairly happy, because even if I've lost a couple of
files, it's definitely a lot better than it could have been!
Thanks for your help!
Jx



 br
 Matthias
 --
 Don't Panic




Re: [BackupPC-users] Restore issues

2009-07-21 Thread Vetch
Hi Matthias,

Thanks for your help on this...

See replies below...

On Tue, Jul 21, 2009 at 6:40 PM, Matthias Meyer matthias.me...@gmx.li wrote:

 Vetch wrote:

 
  Ok - I tried direct restores back into the original directories over the
  network - and it came up with successful restores for all the home
  directories...
  Does this mean that the data has been fully successfully restored?

 Probably. But as I said, BackupPC doesn't check file consistency.


Ok... So it could be that the files are corrupt in the database...
Though given the positive e2fsck results, that seems unlikely...



  I think I have about 1000 files missing (out of about 35000)...

 You think? Do you really miss any of these 1000 files?


No - I haven't noticed any missing files, so I'm not too worried
(realistically, I tend to keep numerous versions of any important document
I'm working on anyway, so...)

 Now, this wouldn't be the end of the world, but I'd be interested to know
 if when it reports success, it has definitely brought back the entire
 dataset...

Yes. But possibly not all the files YOU expected in the backup dataset.


Ok... so backuppc managed to restore all the data it has available...
... but perhaps I didn't count properly, or alternatively, didn't back up
properly in the first place...



  ... and if so... do you have any suggestions as to why I may have
  different numbers of files?
 

 How do you measure the file counts?


I took the somewhat unscientific approach of using an offline backup and
using Windows to count the files via the folder properties...
Essentially, I had one of the folders stored as an offline backup (the one
with 35000-ish files in it) which I synchronise most days...
I used the Windows properties to count the files in the folder (36026), and
then I took an archive (zip) copy of my offline directory, which also had
36026 files in it, based on the archive file count...
I then connected to the server, synchronised, and used the Windows
properties on it again...
It showed (I can't remember now exactly), but I believe it was around
35200...
I then restored the archive, and now the folder properties report 36093 (I
attribute the extra files to being ones which are not offline-synchronisable
using Windows XP offline files (e.g. .pst files, etc.))...
Now, potentially, I guess that means that there could have just been more
files on my offline copy (though I expected only about 14 more, since the
offline sync claimed that 14 files had changed and needed to be
synchronised)...
Equally, possibly there were files on my offline copy that couldn't be
copied to the server through sync without my knowing...
... though I would have expected to have noticed previously...
Equally, it's possible that the backuppc user didn't have rights on the
server to back up all the files, but I was ssh-ing in and sudo-ing the
command, so I believe that should give it root access for the rsync
command...
I don't know - it just seems like there should have been more files...
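
A more reliable count can be taken on the Linux side, bypassing Windows
folder properties and offline-files quirks entirely (a sketch; /home/share
is a placeholder for the real share path):

    # count regular files in the live share
    find /home/share -type f | wc -l

    # count what a restore actually brought back, then compare
    find /home/share-restored -type f | wc -l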

Like I say, I'm not particularly bothered, as I think it's highly unlikely
any of the files I genuinely need have been completely lost, but still...
I'd be interested in knowing what caused the discrepancy - whether it's my
counting, my setup, my configuration or the system behaving strangely...



  I've just tried this - I booted to a live CD and e2fsck-ed the device...
  On first scan, it reported clean... I'm now running an e2fsck -f to force
  it to check, but assuming that it reports the device as clean, can I
  assume that the backups are not corrupted?

 Yes!


Excellent... Well, that's good news...



  In which case, I have to wonder about the missing files...
  Am I just worrying unneccessarily?
 
 Probably. We don't know yet whether files are really missing or whether
 your measurement is wrong.


Heh - it's probably my measurement, isn't it? ;)
Oh well - let's hope so ;)

Once again many thanks,

Jx


 br
 Matthias
 --
 Don't Panic





[BackupPC-users] Restore issues

2009-07-20 Thread Vetch
Hi all,

I have a question...

I tried restoring a backup to my system - it kept on failing with aborted by
signal=PIPE.
I have a feeling it may have been related to corrupt files within the
hardlinks, since when I tried restoring using tar, the tar files were not
readable, and the zip files didn't have as many files in as I would have
expected...

I didn't seem to be able to resolve that, so I wiped the system (keeping the
home directory intact), and rebuilt the server.
Now, unfortunately, the software I use to run the file server decided to
delete my home directories when I restored the configuration...
Go figure...
So now, I'm really worried because I think that I will have lost about 100GB
of data that I believed was backed up...
Anyway, I'm trying to restore now, but the tar files are still coming up as
unreadable...

I guess if there are problems with the data, there's probably nothing that
can be done?

How does backuppc check the consistency of the data once it's been added to
the pool?
I know that it only backs up the data once, but how does it make sure that
the data is in a valid state?
Is it checksums? If so, then I guess it can't be a corruption error...
... but I don't get why I would be finding directories of 3000 files showing
only 1500 in a full backup...

Can anyone help? Any support would be greatly appreciated...
I've got all the coursework from a Masters and a PhD on there!

Jx


Re: [BackupPC-users] upgrading from 2.1.2 to 3.0.0

2008-05-22 Thread Vetch
Hi all,



Sorry to drag up an old thread...

I currently have a very small business server running BackupPC 2.1.2 on
Debian Etch that has a considerable amount of important data backed up on it.

I plan to move the backuppc installation onto a new server on Ubuntu 8.04,
and I would like to take the old data across.

I believe that the easiest way to do this would be to upgrade to BackupPC 3
and use the BackupPC_tarPCCopy export functionality to convert the backups
into a tar stream (using the method from the online documentation,
http://backuppc.sourceforge.net/faq/BackupPC.html#other_installation_topics)
and then move the data...
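
A sketch of that documented move (the paths are Debian-style guesses,
"oldserver" is a placeholder, and BackupPC should be stopped on both ends
while copying):

    # 1) copy the pool across first, preserving hardlinks
    rsync -aH oldserver:/var/lib/backuppc/cpool/ /var/lib/backuppc/cpool/

    # 2) recreate the pc/ tree on the new server; BackupPC_tarPCCopy emits
    #    a tar stream whose hardlinks point back into the copied pool
    cd /var/lib/backuppc
    ssh oldserver sudo -u backuppc \
        /usr/share/backuppc/bin/BackupPC_tarPCCopy /var/lib/backuppc/pc \
        | tar xvPf -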



What I am worried about is how I should go about updating the store...

I notice that in backports there is a version of 3.1 for Etch
http://backuppc.wiki.sourceforge.net/FAQ+installation

... But, I'm not sure how to go about using backports for just backuppc...

Realistically, in some ways, it's not really a huge problem, because I'm
going to retire the server once I've copied off the data, but still, it may
be some time between moving data and finally retiring the server...



The other worry I have is how the upgrade will affect the data... Can I just
use apt pinning to install the backuppc package, and it'll all happen
seamlessly? Or do I need to worry about anything?
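
On the backports question, the usual Etch-era recipe looked like the
following (a sketch; the backports.org source line and target release are
assumptions about this particular setup):

    # /etc/apt/sources.list
    deb http://www.backports.org/debian etch-backports main

    # install just backuppc from backports, leaving everything else on Etch
    apt-get update
    apt-get -t etch-backports install backuppc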



And (sorry) finally, I notice that the data is saved into the /var volume...

I am currently using the intended server as a file server too, and have set
up most of the disk space as /home. I could probably change this, but I
would rather have the backuppc data on the home volume (though I realise it
flies in the face of all hierarchical file system planning and design)...
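
One way to square that (it matches dan's tip in the quoted thread below):
keep the data under /home but mount it where BackupPC expects it. A sketch
with hypothetical paths:

    # put the store on the big /home volume...
    mkdir -p /home/backuppc-data
    # ...and bind-mount it onto the default __TOPDIR__
    mount --bind /home/backuppc-data /var/lib/backuppc

    # the /etc/fstab equivalent, so it survives reboots:
    # /home/backuppc-data  /var/lib/backuppc  none  bind  0  0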



Can anyone give me some pointers on how I should go about achieving these
goals, and the potential issues I should look out for...



Many thanks,



Jx





-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of Paul Archer
Sent: 05 November 2007 04:25
To: dan
Cc: BackupPC users' mailing list
Subject: Re: [BackupPC-users] upgrading from 2.1.2 to 3.0.0



9:19pm, dan wrote:

 betas are betas.  if this is a test setup, go ahead and run the beta, but
 if this is to hold critical data, use the stable.  if you are moving to
 ubuntu 7.10, then just install ubuntu and `apt-get install backuppc` and
 you will be set.

 i will also point out to you that if you will be storing files somewhere
 other than the default (ubuntu) /var/lib/backuppc, maybe consider mounting
 that volume on /var/lib/backuppc, otherwise some of the status info in the
 gui is missing.

Good tip, thanks.

 also, if you want, you can pull bpc3.0 from ubuntu backports instead of
 upgrading.  i personally just moved a few machines to 7.10 server and im
 very happy.

I've upgraded a few machines too, with only a couple of minor problems.

I enabled backports on that machine, but aptitude is still showing only the
old version. Maybe I need to go back and kick it.

Paul

 On 11/4/07, Paul Archer [EMAIL PROTECTED] wrote:

  I installed backuppc on a (K)Ubuntu 7.04 machine, not realizing that I
  was getting version 2.1.2. I plan on upgrading the machine to 7.10, and
  upgrading backuppc to 3.0.0.
  Two questions:

  1) Is there anything particular I should worry about or watch out for
  after the upgrade?

  2) Should I use the packaged 3.0.0, or is 3.1.0beta1 worth going to?
  (This is for home use--important files (to me) but not a mission-critical
  situation or anything.)





[BackupPC-users] Advice on BackupPC

2007-05-16 Thread Vetch

Hi all,

I've just found BackupPC, and I was wondering if it will achieve what I need
it to?

I have a two site network, one in the US, and one in the UK.
Our bandwidth is limited, though will be increasing at some point in the
future, though I couldn't say how much...
I want to backup my data from one site to the other...
In order to assess whether that would be doable, I went to an exhibition of
backup technologies.
One that caught my eye was a company called Data Domain, who claimed to
de-duplicate data at the block level of 16KB chunks...
Apparently, all they send are the changed chunks and the schema to retrieve
the data.

What I am wondering is would BackupPC be a suitable open source replacement
for that technology...?
Does it send the changed data down the line and then check to see if it
already has a copy, or does it check then send?
Presumably it would save significant bandwidth if it checks first...
The other thing is, can BackupPC de-duplicate at the block level or is it
just file level?
I'm thinking that block level might save considerable amounts of traffic,
because we will need to send file dumps of Exchange databases over the
wire...
... Which I assume will mean that we've got at least 16GB to copy every day,
since it'll be creating a new file daily...

On the other hand, would 16KB blocks be duplicated that regularly? I imagine
there is a fair amount of variability in 16KB of ones and zeros, and the
chances of them randomly recurring without being part of the same file are,
I would say, slim...

What do you think?

Any help would be greatly appreciated!

Jx


Re: [BackupPC-users] Advice on BackupPC

2007-05-16 Thread Vetch

Hi Les,

Thanks for the info...

Sounds like an incredibly powerful tool!

See responses below:-

On 5/16/07, Les Mikesell [EMAIL PROTECTED] wrote:


Vetch wrote:

 I have a two site network, one in the US, and one in the UK.
 Our bandwidth is limited, though will be increasing at some point in the
 future, though I couldn't say how much...
 I want to backup my data from one site to the other...
 In order to assess whether that would be do-able, I went to an
 exhibition of backup technologies.
 One that caught my eye was a company called Data Domain, who claimed to
 de-duplicate data at the block level of 16KB chunks...
 Apparently, all they send are the changed chunks and the schema to
 retrieve the data.

Backuppc can use rsync to transfer the data.  Rsync works by reading
through the file at both ends, exchanging block checksums to find the
changed parts.



Ok - so rsync sounds like the transfer method to use...


What I am wondering is would BackupPC be a suitable open source
 replacement for that technology...?
 Does it send the changed data down the line and then check to see if it
 already has a copy, or does it check then send?

It can do either, depending on whether you use the tar, smb, or rsync
transfer methods.



The rsync method, from your previous comment, would presumably check then
send...?


Presumably it would save significant bandwidth if it checks first...
 The other thing is, can BackupPC de-duplicate at the block level or is
 it just file level?
 I'm thinking that block level might save considerable amounts of
 traffic, because we will need to send file dumps of Exchange databases
 over the wire...
 ... Which I assume will mean that we've got about 16GB at least to copy
 everyday, since it'll be creating a new file daily...

 On the other hand, would 16KB blocks be duplicated that regularly - I
 imagine there is a fair amount of variability in 16KB of ones and zeros,
 and the chances of them randomly reoccurring without being part of the
 same file, I would say are slim...

 What do you think?

I think rsync will do it as well as it can be done.  However, it is hard
to tell how much two different Exchange database dumps will have in
common.  Then there is the issue that you could reduce the size by
compressing the file but doing so will make the common parts impossible
to find from one version to another.  You can work around this by using
ssh compression or something like an openvpn tunnel with lzo compression
enabled, leaving the file uncompressed.
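
To make that concrete, two hedged variants (paths hypothetical; both leave
the dump file itself uncompressed so rsync can still match blocks between
versions):

    # compress the ssh transport that rsync runs over
    rsync -av -e 'ssh -C' /srv/dumps/exchange.dmp backup:/srv/dumps/

    # or let rsync compress its own data stream instead
    rsync -avz /srv/dumps/exchange.dmp backup:/srv/dumps/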



I see - so you wouldn't compress the file, you'd compress the tunnel...
Makes sense...
Would it then still get compressed when stored at the other end?

You can test the transfer efficiency locally first to get an idea of how
well the common blocks are handled.  Use the command-line rsync program
to make a copy of one day's dump, then repeat the process the next day
with the same filename.  Rsync will display the size of the file and
the data actually transferred.
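
A sketch of that test (filenames hypothetical). Note that for local copies
rsync defaults to --whole-file, so the delta algorithm has to be forced
back on; --stats then shows how much data really moved:

    # day 1: seed the copy
    rsync -av --stats monday.dmp /mnt/test/exchange.dmp

    # day 2: same target name, new dump; compare "Literal data"
    # (actually sent) against "Matched data" (reused blocks)
    rsync -av --stats --no-whole-file tuesday.dmp /mnt/test/exchange.dmp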



So I would output a copy of the database to the same file name, and rsync
would just take the changes...
I'll try it out...

How well would that work for something like LVM snapshotting?
I'm thinking of migrating my Windows servers to Xen virtual machines on LVM
volumes.
If I take a snapshot of the volume and then mount it somewhere, could I get
BackupPC to copy only the changed data via rsync?
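
A sketch of that snapshot workflow (volume and mount names hypothetical; it
assumes the snapshot contains a filesystem the host can actually mount,
which for a Windows guest means an NTFS driver and the right partition
offset):

    # freeze a point-in-time view of the VM's volume
    lvcreate --size 2G --snapshot --name winvm-snap /dev/vg0/winvm

    # mount it read-only and point the BackupPC share at /mnt/snap
    mount -o ro /dev/vg0/winvm-snap /mnt/snap

    # ... run the backup against /mnt/snap ...

    umount /mnt/snap
    lvremove -f /dev/vg0/winvm-snap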

With regards to the storage - does it keep copies of every version of a
backed-up file, with the differences stored in chunks at that level, or are
the versions stored as distinct whole files?

Cheers,

Jx


--
Les Mikesell
[EMAIL PROTECTED]
