Re: Continuous backup of critical system files

2009-08-25 Thread Modulok
 I'm setting up a firewall using FreeBSD 7.2 and thought that it may
 not be a bad idea to have a continuous backup for important files like
 pf and dnsmasq configurations. By continuous I mean some script that
 would be triggered every few minutes from cron to automatically create
 a backup of any monitored file if it was modified. I also have a full
 system backup in place that is executed daily (dump/restore to a
 compact flash card), so the continuous backup would really be for
 times when someone makes a mistake editing one of the config files and
 needs to revert it to a previous state.

 My initial thought was to create a mercurial repository at the file
 system root and exclude everything except for explicitly added files.
 I'd then run something like hg commit -m `date` from cron every 10
 minutes to record the changes automatically. Can anyone think of a
 better way to do this (existing port specifically for this purpose)?
 Obviously, I need a way to track the history of a file and revert to a
 previous state quickly. The storage of changes should be as
 size-efficient as possible.
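
For concreteness, that cron + mercurial idea might look something like
this (script path, schedule, and commit message format are just an
illustration):

#!/bin/sh
# Commit tracked config files only when something actually changed.
cd / || exit 1
hg status -mard | grep -q . || exit 0   # clean tree, nothing to do
hg commit -m "auto backup $(date '+%Y-%m-%d %H:%M')"

# /etc/crontab entry, every 10 minutes:
# */10  *  *  *  *  root  /usr/local/sbin/confbackup.sh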


Look into 'rsync', available in the ports collection.

Generally, for a basic server, you make backup copies manually before
you edit something. It's a good habit to get into:

# Make a quick backup:
cp rules.pf rules.pf.orig

# Then edit the original:
nano rules.pf

If you're doing some major messing around and don't like the manual
backup approach, look into 'subversion', in the ports collection. It
is a full-featured revision control system, used by many developers
(including the FreeBSD project). You could set up a subversion
repository to store all of your config files. Make changes to them and
commit those changes back to the repository. Then if you make a bunch
of changes you don't like, simply check out a previous revision. It's
a bit more work to set up, but if you're doing a lot of frequent
tinkering it might be worth it.
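
A rough sketch of that workflow (repository location, filenames, and
revision numbers here are only examples):

# One-time setup: create a repository and import a config file.
svnadmin create /var/svn/configs
svn import /etc/pf.conf file:///var/svn/configs/pf.conf -m 'initial import'

# Day to day: check out a working copy, edit, test, commit.
svn checkout file:///var/svn/configs /root/configs
cd /root/configs
vi pf.conf
svn commit -m 'tighten ssh rules' pf.conf

# Regret the change? Pull an older version (say revision 5) back out:
svn cat -r 5 file:///var/svn/configs/pf.conf > /etc/pf.conf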

For general backups I use rsync on a dedicated backup server. This way
if I have to quickly restore something I can simply scp it back to the
production server in seconds. rsync is fast (after the initial backup)
as it only transfers the deltas (changes) within files, and it
automatically works out which files have changed and need to be backed
up. You could configure a cron job to run an rsync script every few
minutes if you wanted. That script could also generate an incremental
copy of the entire backup directory with cp's -l (lowercase ell) flag,
which makes a hard-linked copy that consumes no real additional space.
You can read all about it here:

http://www.sanitarium.net/golug/rsync_backups.html
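
As a condensed sketch of that rotation (shown here with rsync's
--link-dest option, which produces the same hard-linked result in one
step rather than a separate cp -l pass; host and directory names are
made up):

#!/bin/sh
# Hypothetical snapshot rotation: keep three numbered snapshots where
# files unchanged since the last run are hard links, not copies.
BASE=/backups/firewall

rm -rf $BASE/snap.2
[ -d $BASE/snap.1 ] && mv $BASE/snap.1 $BASE/snap.2
[ -d $BASE/snap.0 ] && mv $BASE/snap.0 $BASE/snap.1

# Unchanged files become hard links into snap.1, so each extra
# snapshot costs almost no space; changed files are stored anew.
rsync -a --delete --link-dest=$BASE/snap.1 \
    firewall:/etc/ $BASE/snap.0/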

Whatever you decide, best of luck!
-Modulok-


Re: Continuous backup of critical system files

2009-08-24 Thread Matthias Apitz
On Monday, August 24, 2009 at 11:57:25 AM -0400, Maxim Khitrov wrote:

 Hello all,
 
 I'm setting up a firewall using FreeBSD 7.2 and thought that it may
 not be a bad idea to have a continuous backup for important files like
 pf and dnsmasq configurations. By continuous I mean some script that
 would be triggered every few minutes from cron to automatically create
 a backup of any monitored file if it was modified. I also have a full
 system backup in place that is executed daily (dump/restore to a
 compact flash card), so the continuous backup would really be for
 times when someone makes a mistake editing one of the config files and
 needs to revert it to a previous state.
 
snip

Hello,

We have been running a FreeBSD-based firewall at my company for many
years. All modified config files (rc.conf, ipf.rules, ...) have always
been kept on an internal host in CVS, modified only there, and scp'ed
to the firewall to make the change active. After a hardware fault I
was once able to do a bare-metal restore of the firewall within an
hour: I just installed the base system and copied the config back over
from CVS.
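
In outline, that procedure looks something like this (working copy,
file, and host names are invented; the reload command depends on which
firewall you run, ipfilter shown here as an example):

# On the internal host: edit and commit in the CVS working copy.
cd ~/work/firewall
vi ipf.rules
cvs commit -m 'open port 25 for the new MX' ipf.rules

# Push the tested file to the firewall and activate it:
scp ipf.rules firewall:/etc/ipf.rules
ssh firewall 'ipf -Fa -f /etc/ipf.rules'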

matthias

-- 
Matthias Apitz
t +49-89-61308 351 - f +49-89-61308 399 - m +49-170-4527211
e g...@unixarea.de - w http://www.unixarea.de/
People who hate Microsoft Windows use Linux but people who love UNIX use 
FreeBSD.


Re: Continuous backup of critical system files

2009-08-24 Thread chris scott
2009/8/24 Maxim Khitrov mkhit...@gmail.com

 Hello all,

 I'm setting up a firewall using FreeBSD 7.2 and thought that it may
 not be a bad idea to have a continuous backup for important files like
 pf and dnsmasq configurations. By continuous I mean some script that
 would be triggered every few minutes from cron to automatically create
 a backup of any monitored file if it was modified. I also have a full
 system backup in place that is executed daily (dump/restore to a
 compact flash card), so the continuous backup would really be for
 times when someone makes a mistake editing one of the config files and
 needs to revert it to a previous state.

snip


I rsync all my system files to a filer running ZFS. I have a separate
ZFS filesystem for every host, and I snapshot that filesystem after
each rsync. We keep 35 snapshots for retention, as we do daily rsyncs.


You might want more of a rolling snapshot policy: keep one snapshot
for every 10 minutes of the last hour, then drop to hourly for the
next 6 hours, then daily, then weekly, etc.

It works quite well. We have also found it handy for forensics when
we have had a fault.
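
A sketch of the filer-side job (pool, filesystem, and host names are
examples; assumes the default ZFS mountpoint layout):

#!/bin/sh
# Pull the files, then snapshot the per-host filesystem.
HOST=firewall
FS=tank/backups/$HOST

rsync -a --delete root@$HOST:/etc/ /$FS/etc/
zfs snapshot $FS@$(date +%Y%m%d-%H%M)

# Expire old snapshots, keeping the newest 35.
SNAPS=$(zfs list -H -t snapshot -o name -s creation -r $FS)
COUNT=$(echo "$SNAPS" | wc -l)
if [ "$COUNT" -gt 35 ]; then
    echo "$SNAPS" | head -n $((COUNT - 35)) | xargs -n 1 zfs destroy
fi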


Re: Continuous backup of critical system files

2009-08-24 Thread chris scott
2009/8/24 chris scott kra...@googlemail.com



snip


I forgot to say that it need not be a ZFS backend, just a filesystem
on which you can reliably take snapshots.


Re: Continuous backup of critical system files

2009-08-24 Thread Erik Norgaard

Maxim Khitrov wrote:


I'm setting up a firewall using FreeBSD 7.2 and thought that it may
not be a bad idea to have a continuous backup for important files like
pf and dnsmasq configurations. By continuous I mean some script that
would be triggered every few minutes from cron to automatically create
a backup of any monitored file if it was modified.

...

so the continuous backup would really be for times when someone makes
a mistake editing one of the config files and needs to revert it to
a previous state.


It appears to me that you should review your procedures rather than
deploy such a backup solution. Critical files rarely change (or should
rarely be modified); there should be no need to back them up every 10
minutes.


The more critical the file and the change applied, the more testing
should be done beforehand, and the more care should be taken during
the process to ensure that the original can easily be reinstated. You
don't want to spend time digging it up from some backup. If your files
are very critical, then you should have a cvs repository in place as
well as a testing environment. I guess this is not the case.


If they are less critical, then good practices are the way to go:
before modifying anything, create a backup in the same location. I add
a serial number rather than .bak, .old, .tmp, .new, etc., which gets
really confusing. I use .YYYYMMDDXX for dated backups and .orig for
the original/default file. This way it's easy to see when a file was
modified, to diff against the original, and to delete old backups.
With .old you have no continuity: you can't name your next backup
.older.
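
For example (the date and the XX serial are filled in by hand; the
filename is illustrative):

# Dated, serial-numbered backup before editing:
cp pf.conf pf.conf.2009082401
vi pf.conf

# Compare against the untouched default later:
diff -u pf.conf.orig pf.conf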


Further, for small tweaks, I comment/uncomment parameters and apply 
these for fast testing from another session, so I don't even exit the 
editor. Certainly, I may save and test the file multiple times while 
tweaking, but in the end, there are only two files worth keeping: the 
last stable and the current.
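
The exact commands depend on the service; with pf, that fast test
cycle from a second session might be (just one example):

# Syntax-check the edited rules, then load them if they parse:
pfctl -nf /etc/pf.conf && pfctl -f /etc/pf.conf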


Of course, I'm not saying it's a bad idea to keep backups, only that
if you find a need to continuously back up files as described, then
you should review your procedures.


See also the current thread on what should be backed up.

BR, Erik

--
Erik Nørgaard
Ph: +34.666334818/+34.915211157  http://www.locolomo.org


Re: Continuous backup of critical system files

2009-08-24 Thread Roland Smith
On Mon, Aug 24, 2009 at 11:57:25AM -0400, Maxim Khitrov wrote:
 Hello all,
 
 I'm setting up a firewall using FreeBSD 7.2 and thought that it may
 not be a bad idea to have a continuous backup for important files like
 pf and dnsmasq configurations.
snip
 My initial thought was to create a mercurial repository at the file
 system root and exclude everything except for explicitly added files.
 I'd then run something like hg commit -m `date` from cron every 10
 minutes to record the changes automatically. 

Isn't this ass-backwards? Configuration files shouldn't change suddenly.

My system is to keep all configuration files that I have changed from their
defaults in a revision control system repository. That is where I add and
(after testing) commit changes to those files. I then use an install script to
copy changed files (based on SHA1 checksum) to their correct location in /etc
or /usr/local/etc and run restart commands if necessary. So installation is
always done from the repository to the filesystem. If a change doesn't work I
just check out the last good version of the file(s), re-run the install script
and we're back to normal.
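
A sketch of such an install script (the file list, paths, and reload
command are examples; FreeBSD's sha1(1) does the checksum comparison):

#!/bin/sh
# Run from the repository working copy; install files that differ.
install_if_changed() {
    src=$1 dst=$2
    # Skip if the destination exists and the checksums match.
    if [ -f "$dst" ] && [ "$(sha1 -q "$src")" = "$(sha1 -q "$dst")" ]; then
        return 1
    fi
    cp -p "$src" "$dst"
}

cd /root/configs || exit 1
if install_if_changed etc/pf.conf /etc/pf.conf; then
    pfctl -f /etc/pf.conf    # reload only when the rules changed
fi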

Roland
-- 
R.F.Smith   http://www.xs4all.nl/~rsmith/
[plain text _non-HTML_ PGP/GnuPG encrypted/signed email much appreciated]
pgp: 1A2B 477F 9970 BA3C 2914  B7CE 1277 EFB0 C321 A725 (KeyID: C321A725)

