Re: Backup strategies

2009-02-04 Thread Dan Harnett
On Tue, Feb 03, 2009 at 04:29:41PM -0500, Jonathan Thornburg wrote:
 Etienne Robillard robillard.etienne () gmail ! com wrote
  i kinda like cpio for fast backup of filesystems... for large media
  files (think anime movies) -- I think its generally best to just
  burn them on a iso..
 
 I have found rsync to an external usb hard disk to work very nicely;
 these are now cheap and readily available up to over a terabyte.
 Here are a few notes from my experience using this strategy for the
 past several years:

I do the same for my laptop.  I use a drive compatible with my laptop in
a USB enclosure.  I partition the USB disk identically to the one in my
laptop and use rsync to clone the data.  Should the drive in my laptop
fail, I can just pop the disk out of the USB enclosure and into the
laptop.  It's also possible to just boot off the USB disk.
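A minimal sketch of that clone step, assuming the USB disk's root
partition shows up as sd1a and is already partitioned to match (the
device name and mount point are assumptions):

  #!/bin/sh
  # mirror the live root filesystem onto the clone disk
  mount /dev/sd1a /mnt/clone
  rsync -aHx --delete / /mnt/clone/
  umount /mnt/clone

The -x flag keeps rsync on one filesystem; repeat per partition if the
disk is split up the same way as the laptop's.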

  #!/bin/sh
  set -x
  rsync -aHESvv --delete \
    --exclude '/home/jonathan/crypt/*' \
    --exclude '/mnt/oxygen/home/jonathan/crypt/*' \
    /home/jonathan/ /mnt/oxygen/home/jonathan/
   This works fine except that the --exclude options are not honored
   (files under those directories are still copied).  I don't know what's
   wrong there...

They are honored, but the paths are interpreted relative to the transfer
root.  You're actually excluding
'/home/jonathan/home/jonathan/crypt/*', etc.

  rsync -aHESvv --delete --exclude '/crypt/*' \
    /home/jonathan/ /mnt/oxygen/home/jonathan/

This link[1] and rsnapshot in ports may also be of interest to some.

[1] http://www.mikerubel.org/computers/rsync_snapshots/
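The heart of the technique described at [1] is rotating hard-linked
snapshots, so unchanged files cost no extra space.  A stripped-down
sketch (paths and the three-snapshot depth are placeholders):

  #!/bin/sh
  # age the snapshots; the oldest falls off the end
  rm -rf /backup/snap.2
  [ -d /backup/snap.1 ] && mv /backup/snap.1 /backup/snap.2
  [ -d /backup/snap.0 ] && mv /backup/snap.0 /backup/snap.1
  # sync, hard-linking files unchanged since the last snapshot
  rsync -a --delete --link-dest=/backup/snap.1 /home/ /backup/snap.0/

rsnapshot automates essentially this rotation.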



Re: Backup strategies

2009-02-04 Thread Daniel A. Ramaley
On Tuesday February 3 2009 21:16, you wrote:
 rsync -aHESvv --delete \
   --exclude '/home/jonathan/crypt/*' \
   --exclude '/mnt/oxygen/home/jonathan/crypt/*' \
   /home/jonathan/ /mnt/oxygen/home/jonathan/
  This works fine except that the --exclude options are not honored
  (files under those directories are still copied).  I don't know
 what's wrong there...

[...]

how about using double-quotes instead? e.g., --exclude
/home/jonathan/crypt/*. your shell might be preventing rsync from
looking at what's inside the quotes...

I think rsync needs to see the asterisks, not the shell. So single 
quotes are correct. In my own scripts, when I wanted to exclude a 
directory I used to specify just the directory with no wildcard and it 
worked. Example:
--exclude '/home/jonathan/crypt'
However, that would also exclude /home/jonathan/crypt2 as collateral 
damage. This format is what I use now and does what I think you want:
--exclude '/home/jonathan/crypt/**'


Dan Ramaley                    Dial Center 118, Drake University
Network Programmer/Analyst     2407 Carpenter Ave
+1 515 271-4540                Des Moines IA 50311 USA



Re: Backup strategies

2009-02-03 Thread Jonathan Thornburg
Etienne Robillard robillard.etienne () gmail ! com wrote
 i kinda like cpio for fast backup of filesystems... for large media
 files (think anime movies) -- I think its generally best to just
 burn them on a iso..

I have found rsync to an external usb hard disk to work very nicely;
these are now cheap and readily available up to over a terabyte.
Here are a few notes from my experience using this strategy for the
past several years:
* With rsync, the initial backup does a full copy, but then future
  backups automatically only copy changed files.
* I found that performance went from painfully slow to ok when I
  switched my external disks from ext2fs to ffs mounted softdep,noatime.
* I have had no problems with single files as big as 5 GB.
* For extra disaster-insurance I actually use a pair of external disks,
  one at home and one at my office.  I swap them every week or so.
* Backups can be a security risk, since anyone who steals the backup
  medium has instant access to all the files stored there.  This is a
  great use for encrypting filesystems, eg svnd, raidctl, or cfs (ports).
* Backups need to be hassle-free and as tired-system-administrator-proof
  as possible, so it's good to script the process.  I use scripts like
  the following:
 #!/bin/sh
 set -x
 rsync -aHESvv --delete \
   --exclude '/home/jonathan/crypt/*' \
   --exclude '/mnt/oxygen/home/jonathan/crypt/*' \
   /home/jonathan/ /mnt/oxygen/home/jonathan/
  This works fine except that the --exclude options are not honored
  (files under those directories are still copied).  I don't know what's
  wrong there...

-- 
-- Jonathan Thornburg [remove -animal to reply] 
jth...@astro.indiana-zebra.edu
   Dept of Astronomy, Indiana University, Bloomington, Indiana, USA
   Washing one's hands of the conflict between the powerful and the
powerless means to side with the powerful, not to be neutral.
  -- quote by Freire / poster by Oxfam



Re: Backup strategies

2009-02-03 Thread Amarendra Godbole
On Wed, Feb 4, 2009 at 2:59 AM, Jonathan Thornburg
jth...@astro.indiana.edu wrote:
 Etienne Robillard robillard.etienne () gmail ! com wrote
 i kinda like cpio for fast backup of filesystems... for large media
 files (think anime movies) -- I think its generally best to just
 burn them on a iso..

 I have found rsync to an external usb hard disk to work very nicely;
 these are now cheap and readily available up to over a terabyte.
 Here are a few notes from my experience using this strategy for the
 past several years:
 * With rsync, the initial backup does a full copy, but then future
  backups automatically only copy changed files.
 * I found that performance went from painfully slow to ok when I
  switched my external disks from ext2fs to ffs mounted softdep,noatime.
 * I have had no problems with single files as big as 5 GB.
 * For extra disaster-insurance I actually use a pair of external disks,
  one at home and one at my office.  I swap them every week or so.

thanks. this gives me some pointers to implement a better backup
strategy. i also use a similar setup, except that i don't have
multiple disks (no backup for the backup).

 * Backups can be a security risk, since anyone who steals the backup
  medium has instant access to all the files stored there.  This is a
  great use for encrypting filesystems, eg svnd, raidctl, or cfs (ports).
 * Backups need to be hassle-free and as tired-system-administrator-proof
  as possible, so it's good to script the process.  I use scripts like
  the following:
 #!/bin/sh
 set -x
 rsync -aHESvv --delete \
   --exclude '/home/jonathan/crypt/*' \
   --exclude '/mnt/oxygen/home/jonathan/crypt/*' \
   /home/jonathan/ /mnt/oxygen/home/jonathan/
  This works fine except that the --exclude options are not honored
  (files under those directories are still copied).  I don't know what's
  wrong there...
[...]

how about using double-quotes instead? e.g., --exclude
/home/jonathan/crypt/*. your shell might be preventing rsync from
looking at what's inside the quotes...

-amarendra



Re: Backup strategies

2009-02-01 Thread Khalid Schofield

On 31 Jan 2009, at 06:36, Predrag Punosevac wrote:


 Dear All,

 I am seeking advice about the backup strategies and possible use
 of CVS to accomplish this task.

 [...]



For a good backup solution for multiple machines I don't think you can
go wrong with Amanda (www.amanda.org).  I use it to back up 50 TB per
week!

If you want to use a version control system then I'd look at
Subversion.  It's far superior to CVS, but a little less diehard Unix.
CVS has been around for far longer and is sometimes built into
systems, but I'd give Subversion a go.

For my home web server I bought a DLT8000 tape drive off eBay for £20
and bought a 2nd one just in case the first fails.  I use dump to back
up to tape weekly.
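For reference, that weekly tape run might look roughly like this (the
tape device and the dumped filesystem are assumptions; -u records the
dump in /etc/dumpdates):

  #!/bin/sh
  # level-0 dump of /var to the non-rewinding tape device
  dump -0au -f /dev/nrst0 /var
  mt -f /dev/nrst0 rewind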



Khalid



Re: Backup strategies

2009-02-01 Thread Toni Mueller
Hi,

On Sat, 31.01.2009 at 14:04:32 +0000, Dieter open...@sopwith.solgatos.com
wrote:
 ISO files have a 2 GB filesize limit, so large files don't fit.

are you sure?

I can fetch files that are well over 4GB and burn them on DVD.  These
files are called ISO files, but I don't know exactly what's inside
them.  Sample file:

 ftp://ftp.gwdg.de/linux/knoppix/dvd/KNOPPIX_V5.3.1DVD-2008-03-26-EN.iso

(4342594 KB)

I never tried to burn a CD or DVD under OpenBSD, though.

 Backing up the big stuff is problematic.

Right.


Kind regards,
--Toni++



Re: Backup strategies

2009-02-01 Thread Matthew Szudzik
On Sun, Feb 01, 2009 at 12:12:50PM +0100, Toni Mueller wrote:
 I can fetch files that are well over 4GB and burn them on DVD. These
 files are called as ISO files, but I don't know exactly what's inside

See

 
http://en.wikipedia.org/wiki/ISO_9660#The_4_GiB_.28or_2_GiB_depending_on_implementation.29_file_size_limit

Some operating systems can handle files up to 4GB on an ISO 9660
filesystem, and other operating systems can handle more than 4GB.  But
if you want your ISO 9660 filesystem to be fully portable, you should
stick to the 2GB limit.



Re: Backup strategies

2009-02-01 Thread Matthew Szudzik
On Sat, Jan 31, 2009 at 02:04:32PM +, Dieter wrote:
  i kinda like cpio for fast backup of filesystems... for large media
  files (think anime movies) -- I think its generally best to just
  burn them on a iso..
 
 ISO files have a 2 GB filesize limit, so large files don't fit.

I use
 pax -w -B 1500m -x cpio
to avoid the 2GB filesize limit in ISO filesystems.  The command splits
the cpio archive into 1.5GB files--exactly three of these files will fit
on a standard DVD.
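Spelled out with an explicit archive file (the -f name is a
placeholder; pax prompts for the next volume each time the byte limit
is reached):

  # write a multi-volume cpio-format archive, ~1.5GB per volume
  pax -w -B 1500m -x cpio -f backup.cpio /home/user
  # read it back later
  pax -r -v -f backup.cpio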



Re: Backup strategies

2009-02-01 Thread Toni Mueller
On Sun, 01.02.2009 at 13:01:52 +0000, Matthew Szudzik mszud...@andrew.cmu.edu
wrote:
 See
  
 http://en.wikipedia.org/wiki/ISO_9660#The_4_GiB_.28or_2_GiB_depending_on_implementation.29_file_size_limit

Thanks for the heads-up, but

 Some operating systems can handle files up to 4GB on an ISO 9660
 filesystem, and other operating systems can handle more than 4GB.  But
 if you want your ISO 9660 filesystem to be fully portable, you should
 stick to the 2GB limit.

if I'm not mistaken, quite a bit of software today comes on DVDs,
crammed to the brim. So I wonder whether the standard has been
extended, whether there's a convention about how to deal with larger
files, or whether it's sheer accident that it works.

Besides, having media types that can't be fully utilized is neither
useful nor acceptable, imho, but the solution can't be to make only
smaller media.


Kind regards,
--Toni++



Re: Backup strategies

2009-02-01 Thread Pierre Riteau

On 1 févr. 09, at 18:11, Toni Mueller wrote:


 On Sun, 01.02.2009 at 13:01:52 +0000, Matthew Szudzik
 mszud...@andrew.cmu.edu wrote:

 See

 http://en.wikipedia.org/wiki/ISO_9660#The_4_GiB_.28or_2_GiB_depending_on_implementation.29_file_size_limit

 Thanks for the heads-up, but

 Some operating systems can handle files up to 4GB on an ISO 9660
 filesystem, and other operating systems can handle more than 4GB.  But
 if you want your ISO 9660 filesystem to be fully portable, you should
 stick to the 2GB limit.

 if I'm not mistaken, quite a bit of software today comes on DVDs,
 crammed to the brim. So I wonder whether the standard has been
 extended, whether there's a convention about how to deal with larger
 files, or whether it's sheer accident that it works.

 Besides, having media types that can't be fully utilized is neither
 useful nor acceptable, imho, but the solution can't be to make only
 smaller media.


You seem to be mistaken.
The 4GB file limitation is for files *INSIDE* an ISO file system, not
for the ISO itself.
You can use the UDF format to store larger files (and avoid other
limitations too, like filename length), but it might not be as portable
as an ISO file system.
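If mkisofs from the cdrtools port is available, something along these
lines is one way to build such an image; I'm recalling the -iso-level
and -udf flags from memory, so treat them as assumptions:

  # ISO 9660 level 3 plus UDF structures, for files past the old limits
  mkisofs -iso-level 3 -udf -o backup.iso /path/to/big/files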



Re: Backup strategies

2009-02-01 Thread Matthew Szudzik
On Sun, Feb 01, 2009 at 06:34:31PM +0100, Pierre Riteau wrote:
 The 4GB file limitation is for files *INSIDE* an ISO file system, not
 for the ISO itself.

Exactly!  It's OK to have a DVD crammed to the brim with 4.7GB of
data, as long as the individual files on the DVD are each under the
filesize limit dictated by the operating system.  One big 4.7GB file
will probably exceed the limit, but a lot of little files whose TOTAL
SIZE is 4.7GB is OK.



Re: Backup strategies

2009-02-01 Thread Duncan Patton a Campbell
On Sat, 31 Jan 2009 01:36:49 -0500
Predrag Punosevac punoseva...@gmail.com wrote:

 Dear All,
 
 I am seeking advice about the backup strategies and possible use
 of CVS to accomplish this task.
 

Just a suggestion but, given that

1. you want the latest revs available to use on any of several machines,
2. these machines run a variety of OS
3. you want fully restorable data

you might consider having two layers of backup, such that

a. you have a full raid 1 (mirror) array with 3 identical disks,
   one of which you swap out periodically with associated restoration
   of the latest image to the swapped in disk,
b. you share the same area of this server using NFS and SMB, with
   similar permissions,
c. consider using git, and move the .git subdirs into symlinks
   with the origins on your raid (sketched below).

This is not something I've implemented in full, but I have been
considering it piecemeal as a kind of generalized solution to SMB
requirements.  On the other hand I keep hearing that these requirements
will be elevated to the clouds regardless of the lack of visibility ;-)
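A sketch of what (c) could look like for one repository (paths are
placeholders; this assumes git tolerates .git being a symlink, which
it does in my experience):

  # keep the repository data on the mirrored server, leave a symlink
  mv ~/project/.git /raid/repos/project.git
  ln -s /raid/repos/project.git ~/project/.git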

Dhu



Re: Backup strategies

2009-02-01 Thread Toni Mueller
Hi,

On Sun, 01.02.2009 at 18:34:31 +0100, Pierre Riteau pierre.rit...@gmail.com 
wrote:
 You seem to be mistaken.

yes. Thanks to all of you, and note to self: Don't post when
tired and distracted...


Kind regards,
--Toni++



Re: Backup strategies

2009-02-01 Thread (private) HKS
On Sat, Jan 31, 2009 at 6:17 PM, Jason Dixon ja...@dixongroup.net wrote:
 There have been plenty of comments about distributed rcs systems.  I
 have no complaints there at all, but I wanted to mention Bacula as a
 solid backup software option.  We use it for our production needs in the
 office and colocation facility and I use it at home for my personal
 stuff.  Works very well and Mike Erdely has done an excellent job with
 the port (sysutils/bacula).

 --
 Jason Dixon
 DixonGroup Consulting
 http://www.dixongroup.net/



I can (vehemently) second the Bacula recommendation for traditional
archive-style backups.

My reading of the OP's requirements seemed more along the lines of
managing edits of the same files on multiple machines, with the
possibility of rolling back to an older version if necessary. If I
misread this and he's looking more for data preservation, I know of no
more intuitive, self-managed, flexible backup system than Bacula.

-HKS



Re: Backup strategies

2009-01-31 Thread Jukka Ruohonen
On Sat, Jan 31, 2009 at 01:36:49AM -0500, Predrag Punosevac wrote:
 I am seeking advice about the backup strategies and possible use
 of CVS to accomplish this task.
 
 I happen to use 4-5 different computers on a daily basis for my work.
 I use my laptop, desktop, and a file server at work, as well as my
 personal desktop and my wife's laptop at home.
 It is of paramount importance for me that my files are in sync on all
 5 computers, for two reasons.  First, I want to always start working
 with the latest and most up-to-date version of my files, regardless
 of the computer I am using.  Secondly, if a HDD dies on one or even
 three or four computers at the same moment, I will still have a
 backup copy to recover the work.

I have been using a comparable setup for years. Using a version control
system for the task is doable, but in my opinion somewhat cumbersome to use
on a daily basis, especially if the data contains large files. While there are
special tools like unison, I have settled on rsync and ssh. Easy to
use and maintain, but not a replacement for backups.
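In case it's useful, the whole arrangement can be one command in each
direction (host and paths are placeholders):

  # push the working tree to the file server over ssh
  rsync -az --delete -e ssh ~/work/ user@fileserver:work/
  # pull it back down on the next machine
  rsync -az --delete -e ssh user@fileserver:work/ ~/work/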

- Jukka.



Re: Backup strategies

2009-01-31 Thread Stuart Henderson
On 2009-01-31, Predrag Punosevac punoseva...@gmail.com wrote:
 I am seeking advice about the backup strategies and possible use
 of CVS to accomplish this task.

..

 I have seen a thread about 2-3 months ago on misc in which there
 was a similar question by an OpenBSD user who wanted to keep the
 /etc on his firewall machines up to date, as well as back up
 configuration files in case of disaster, using CVS.

CVS is great for text-based config files, and has the advantage that
it's in base. but it's slow at some things, and sometimes it can e.g. be
nice to have access to the whole revision history from whichever machine
you're using..

you might find something like git or hg works better for you, or
you might find CVS is fine, or you might find your current method
is really the most appropriate to how you work.

it's really a personal thing, try some alternatives and see which
you get along with best.



Re: Backup strategies

2009-01-31 Thread Etienne Robillard
On Sat, 31 Jan 2009 13:03:11 +0000 (UTC)
Stuart Henderson s...@spacehopper.org wrote:

 On 2009-01-31, Predrag Punosevac punoseva...@gmail.com wrote:
  I am seeking advice about the backup strategies and possible use
  of CVS to accomplish this task.

 [...]

 it's really a personal thing, try some alternatives and see which
 you get along with best.

i kinda like cpio for fast backup of filesystems... for large media
files (think anime movies) -- I think its generally best to just
burn them on a iso..

cheers!

erob



Re: Backup strategies

2009-01-31 Thread Lars Noodén
Predrag Punosevac wrote:
 ...It is of paramount importance for me that my files are in sync on all
 5 computers...

Can you give more info about the nature of files you wish to keep in
sync?  System configuration, text processing, databases, executables, etc?

Are the files all text, i.e. xml / sgml / source code / config files,
or what?


-Lars



Re: Backup strategies

2009-01-31 Thread Rich Kulawiec
If you can set up a common data repository and if you can ensure that
you always update that repository when you're finished working on
computer A before moving to computer B, then that may be the best
method for keeping your working set of files synchronized.

If in addition to that you need revision control, which might not be
a bad idea given your description of what you're doing, then something
like CVS or subversion would probably be appropriate.

You should probably also investigate the capabilities of rsync,
which many people (including me) use to keep (for example) desktop
systems in sync with laptops.  A caveat: it's possible to do a great
deal of damage with rsync very rapidly, so you'll need to have the
self-discipline to ensure that your data source and data sink are what
you think they are, and in the state you think they are, before you run it.
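One concrete form of that self-discipline is to preview with -n (dry
run) before letting rsync touch anything (paths are placeholders):

  # show what would be copied or deleted, without doing any of it
  rsync -avn --delete ~/work/ laptop:work/
  # only after the preview looks sane, run it for real
  rsync -av --delete ~/work/ laptop:work/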

---Rsk



Re: Backup strategies

2009-01-31 Thread (private) HKS
On Sat, Jan 31, 2009 at 1:36 AM, Predrag Punosevac
punoseva...@gmail.com wrote:
 Dear All,

 I am seeking advice about the backup strategies and possible use
 of CVS to accomplish this task.

 [...]




Mercurial would suit you nicely. It's distributed version control, so
you don't have to pull down the whole damn repository every time; it's
got a solid merge engine, and you can revert to older versions pretty
easily. Simply clone the central repository onto each individual box,
and at the beginning of work run an update. At the end, commit and
push your changes back to the central server.
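The daily round trip is only a few commands (the repository URL is a
placeholder):

  # once per machine: clone the central repository
  hg clone ssh://fileserver//home/you/repo work
  # start of a session: fetch new changesets and update to the latest
  hg pull -u
  # end of a session: record and publish your changes
  hg commit -m "today's work"
  hg push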

-HKS



Re: Backup strategies

2009-01-31 Thread punosevac72
Lars Noodén larsnoo...@openoffice.org wrote:

 Predrag Punosevac wrote:
  ...It is of paramount importance for me that my files are in sync on all
  5 computers...

 Can you give more info about the nature of files you wish to keep in
 sync?  System configuration, text processing, databases, executables, etc?

 Are the files all text  i.e. xml / sgml / source code / config files?
 or what?


 -Lars
@Lars
My bad, Lars, I should have been more clear about it.
It is a mixture of text files (.tex, .me, .csv, .c, .sh, .html) as
well as PostScript files of images (diagrams etc). Occasionally I deal
with multimedia, be it jpeg images or short animations.

@Marko
I bookmarked your web-site. I will have a hard look at your work.


@-HKS
Point taken about Mercurial. I will experiment with it. How good
is it with occasional image files? It is definitely a big plus that I
can look at the changes I made either in papers I am writing or in my
students' grades (.csv).

Big thanks to all who took the time to respond to my message

Predrag



Re: Backup strategies

2009-01-31 Thread (private) HKS
On Sat, Jan 31, 2009 at 2:21 PM,  punoseva...@gmail.com wrote:
 @-HKS
 Point taken about Mercurial. I will experiment with it. How good
 is it with occasional image files? It is definitely a big plus that I
 can look at the changes I made either in papers I am writing or in my
 students' grades (.csv).


It handles images just fine. I don't think it can store images by diff
(might be wrong, it does that with plenty of other filetypes), but it
certainly doesn't choke on them.

-HKS



Re: Backup strategies

2009-01-31 Thread Jason Dixon
There have been plenty of comments about distributed rcs systems.  I
have no complaints there at all, but I wanted to mention Bacula as a
solid backup software option.  We use it for our production needs in the
office and colocation facility and I use it at home for my personal
stuff.  Works very well and Mike Erdely has done an excellent job with
the port (sysutils/bacula).

-- 
Jason Dixon
DixonGroup Consulting
http://www.dixongroup.net/



Re: Backup strategies

2009-01-31 Thread Dieter
 i kinda like cpio for fast backup of filesystems... for large media
 files (think anime movies) -- I think its generally best to just
 burn them on a iso..

ISO files have a 2 GB filesize limit, so large files don't fit.
I switched to using FFS, which allows files as large as the media
will hold (4.7 GB for DVD) at the expense of reduced portability
to other piles of bits claiming to be an operating system.
I now make my small filesystems the exact same size as a DVD,
which allows making a backup DVD that can be mounted and
accessed normally; easier than messing with restore or tar.
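A sketch of building such a DVD-sized FFS image with vnd(4); the image
size and the growisofs burn (dvd+rw-tools in ports) are assumptions:

  #!/bin/sh
  # create an image the size of a single-layer DVD and put FFS on it
  dd if=/dev/zero of=dvd.img bs=1m count=4482
  vnconfig vnd0 dvd.img
  newfs /dev/rvnd0c
  mount /dev/vnd0c /mnt
  # ... copy the backup into /mnt, then tear down and burn ...
  umount /mnt
  vnconfig -u vnd0
  growisofs -dvd-compat -Z /dev/rcd0c=dvd.img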

Backing up the big stuff is problematic.



Backup strategies

2009-01-30 Thread Predrag Punosevac
Dear All,

I am seeking advice about the backup strategies and possible use
of CVS to accomplish this task.

I happen to use 4-5 different computers on a daily basis for my work.
I use my laptop, desktop, and a file server at work, as well as my
personal desktop and my wife's laptop at home.
It is of paramount importance for me that my files are in sync on all
5 computers, for two reasons.  First, I want to always start working
with the latest and most up-to-date version of my files, regardless
of the computer I am using.  Secondly, if a HDD dies on one or even
three or four computers at the same moment, I will still have a
backup copy to recover the work.

Up until now I have used a combination of tar, rarely dd, and my
home-grown scripts to accomplish the above task.  I would always
start work by running a script which would pull the tar files either
from the file server or a USB drive and untar them on my computer.
After I finished work I would run a script to tar the specific
directory I was working on and push it back to the file server and a
USB drive.

However, it did happen once or twice in the past that I forgot to run
the script, which caused me a great deal of frustration.  Suddenly, I
would have two different versions of the same file on two different
computers, and maybe a third, older version on my file server.
It also happened in the past that I modified files, realized that the
modification sucked, and could not recover the specific older version
of a particular file.
I do periodically burn DVDs with my entire home directory, date them,
and keep them on the shelf.

Are there any advantages to using CVS over my present method, or am I
just imagining them?  It looks to me that CVS could help me utilize a
pull+push strategy for backing up the files, and would give me an
advantage over tar and dd by allowing incremental updates as well as
keeping past snapshots of my work.

I have seen a thread about 2-3 months ago on misc in which there
was a similar question by an OpenBSD user who wanted to keep the
/etc on his firewall machines up to date, as well as back up
configuration files in case of disaster, using CVS.

I am open to any suggestions, but I do have a strong preference for
tools from the base of the system.  I noticed a couple of ports with
poor-man's tools for accomplishing the above tasks.

Thanks,
Predrag