On 14.01.19 10:16, Chris Albertson wrote:
> Here are the classic failures:
> 
>  1) You have a "backup drive" that mirrors all your data.   So after
> spending a few hours editing files you save the file, but the software
> has a bug and writes a corrupted file to disk.   Then you "backup" the
> file and this overwrites the only good copy of the file, the one you
> backed up yesterday.    So now both the file and the backup are corrupted.
>  Solution:  NEVER overwrite a backup, only save the CHANGES.   You
> need to be able to go back in time and pull out the last working
> version

In reality, there's always more than one way to eat an elephant, so
there's no real need to complicate life by making backups a growing
agglomeration of deltas¹. The nifty and very *nixy rsync utility solves
the write corruption problem directly. Rsync always verifies that each
transferred file was correctly reconstructed on the receiving side, by
checking a whole-file checksum that is generated as the file is
transferred, so problem "1)" does not exist.

To avoid another clobber problem - overwriting a newer file backed up
from e.g. my laptop while out at the farm - the -u option protects.
And backing up to a bunch of usb sticks gives me recovery points over
recent history, plus off-site backup, as they go with me when I leave
the building.
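
For concreteness, an invocation along these lines does the job (the paths
here are only placeholders; -a preserves times and permissions, -u skips
any file that is already newer on the destination):

    rsync -au /home/erik/work/ /media/usbstick/work/

The trailing slash on the source matters to rsync: it means "the contents
of this directory", not the directory itself.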

That said, there is another corruption problem common to all backup
methods. I've had a usb stick corrupt dozens of bytes in a few files
over time. If the source files have not changed, they are not rewritten,
and the backup utility is unaware of the media deterioration. For that
reason, I run a "diff -qr" on each directory tree after each backup.
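
That check is nothing fancy, just something along the lines of (placeholder
paths again):

    diff -qr /home/erik/work /media/usbstick/work

The -q reports only which files differ and -r recurses the tree, so bytes
quietly rotting on the stick show up as "Files ... differ" lines even when
rsync had nothing to transfer.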

As I don't use --delete, rsync won't delete a file from the backup just
because it's been deleted on the host. That preserves the backup in the
event of an inadvertently clobbered file. As any file difference shows up
in the "diff -qr", I'm prompted to recover an inadvertent loss,
manually propagate a deliberate deletion to the backup media, or if it's
late, go to bed and figure it out later.

> 2) You have a backup disk and it makes automated saves every hour and
> keeps a version history because you read the above.  But lightning
> strikes a power pole 1/4 mile from your house and the surge
> destroys the computer AND the backup disk. Solution: Keep a redundant
> copy of the data off-line.   This does not need to be quite as up to
> date but it should not be "live".  Store it in some other room, not
> connected to power or data cables.

Indeed. If data exists in only one place, then before long it'll exist
only in your imagination.

I reformat the usb sticks as ext3, as vfatty stuff clogs my arteries.
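
The reformat is the usual incantation - the device name below is only an
example, so check with lsblk which node the stick really is before writing
to it:

    mkfs.ext3 -L backup1 /dev/sdX1

Giving it a label with -L means it mounts under a predictable name on most
desktops.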

> 3) ...
> 
> A basic rule of thumb for data that is NOT business critical is that
> "At all times, even during a backup, the data shall exist on at
> least three different physical media and in at least two different
> geographical locations".    For business critical data, meaning
> data that you need to earn a living and that can't be replaced, increment
> those numbers by one.   Four copies of the data in three different
> locations.

Very sound advice, and still best practice, I think. I'm taking one
ute-load (Am: small pickup-load) of klamotten (En: stuff) out to a
shipping container on the farm each month, and adding a usb stick to the
load is warranted as I'd rather lose a load of physical goodies than
irreplaceable data.

Mind you, if the usb stick in my pocket doesn't make it out, then the
need for it is much reduced. The Black Saturday fires here released the
energy of 1500 Hiroshima nuclear explosions, with over 2,000 homes
incinerated to the point of the bricks shattering and Al engine blocks
running like water. California seems to be equally exposed to the
problem now, and elsewhere in the world catastrophic firestorms are
increasingly a foreseeable risk.
 
Erik     (Top temp in the state today: 46°C/114.8°F, a cooler 37°C/98.6°F here.
          Gippsland rainfall lowest ever: 245 mm for the year, dams
          empty, so firefighting ability significantly reduced.)

¹ That's the job of a version control system, best practice for software
  development, very useful for system config files, and handy for text
  documents. But the job of backup is to archive stuff, including the
  VCS's files if one is used. (I still stick with venerable CVS, as
  there is no reason to adopt the latest fashions here.)

