Steve Shockley wrote:
> Whyzzi wrote:
>> Hi gang. Running a lightweight mail server here (50 users total) on
>> OpenBSD, and being the cheap bastard that I am I am looking forward to
>> scripting a nightly backup onto some DVD-RW media. Can I assume that
>> dump/restore is out of the question because of the special commands
>> burners require to begin the writing process? And if that is indeed
>> the case, any recommendations or uber-cool few-liners that would,
>> say, get maximum compression of the contents of /home where all related
>> mail is stored (sendmail/procmail-maildir/dovecot). BTW: Happy
>> Holidays to you and yours!
> 
> You could dump to a file piped through gzip/bzip2, then copy that to 
> CD/DVD.  I back up several OpenBSD machines at work by dump|bzip2 to an 
> nfs share on Windows (SFU), then the Windows box gets backed up to tape. 
> (No Commvault agents for OpenBSD.)  Works well and doesn't require 
> changing our existing corporate backup process.
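
The pipeline described above can be sketched roughly like this (the
device, dump level, and NFS path are assumptions; the real dump(8) line
needs root, so a tar stand-in is used here to show the same pipe shape):

```shell
#!/bin/sh
# On the real server, the dump-through-a-compressor step would look
# something like (assumed paths; run as root):
#
#   dump -0au -f - /home | bzip2 -9 > /mnt/nfs/home.dump.bz2
#
# The same pipeline shape, shown with tar so it runs unprivileged:
mkdir -p demo && echo "mail spool contents" > demo/msg1
tar cf - demo | bzip2 -9 > demo.tar.bz2

# Restoring is the reverse pipe -- the compressor never needs a
# temporary file on disk:
bzip2 -dc demo.tar.bz2 | tar tf -
```

The resulting .bz2 file is an ordinary file, so copying it to a
CD/DVD afterwards is just another file copy.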

If compression is a consideration, I have to mention something I
discovered recently for a similar issue:

If your e-mail flow includes a lot of binary attachments (our people are
very fond of e-mailing PDF files out of scanners), gzip is not very
good.  bzip2 was hardly any better, and many times slower -- most
certainly not worth the tiny improvement in size.  A "solution" turned
out to be rzip, which was slightly faster than bzip2 but delivered MUCH
better compression, routinely better than 2:1 (and sometimes a WHOLE LOT
more than that), compared to bzip2 and gzip, which were managing
something like 30% and 35% reductions, respectively (vs. a 65% reduction
for rzip on the same file).  These numbers are from big (400+ MB) mbox
files; I'd suspect tar files would show similar savings, but individual
files in a Maildir would probably be "different".
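
A rough way to reproduce the comparison on your own mail (the file name
is an assumption -- point it at a real mbox; a throwaway repetitive file
stands in here so the sketch runs anywhere):

```shell
#!/bin/sh
# Compress the same file with gzip, bzip2, and (if installed) rzip,
# then compare sizes.  Generate a stand-in "mbox" first:
FILE=sample.mbox
rm -f "$FILE" "$FILE".gz "$FILE".bz2 "$FILE".rz
i=0
while [ $i -lt 2000 ]; do
    echo "From sender@example.com Mon Jan  1 00:00:00 2007" >> "$FILE"
    i=$((i + 1))
done

gzip  -9c "$FILE" > "$FILE".gz
bzip2 -9c "$FILE" > "$FILE".bz2

# rzip cannot read a pipe, so it is handed the file name directly
# (-k keeps the input, -o names the output):
if command -v rzip >/dev/null 2>&1; then
    rzip -9 -k -o "$FILE".rz "$FILE"
fi

ls -l "$FILE" "$FILE".gz "$FILE".bz2
```

On a synthetic file like this all three compress absurdly well; the
interesting numbers only show up on real mail with binary attachments.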


A few warnings:
  1) You cannot pipe or redirect rzip.  Apparently it needs
non-sequential seeks on the file to find redundancies.  Disk space is
cheap, though; just compress/expand from a local HD partition.
  2) Memory, memory, memory: it finds redundancy by looking over a much
larger "window" than bzip2/gzip/etc. use, so for big files, a gig of RAM
is not wasted.  Un-rzipping a file, however, can be done on very modest
machines -- I have the system burn to DVD-R, then unpack it and verify
MD5 sums, all on an old P4 with 128M RAM (an old RAMBUS machine, so it
will probably never get another stick of RAM unless another machine
dies).  The actual image is created on an amd64 system with a gig of
RAM.
  3) I have been unable to find a Windows port of rzip, which really
doesn't matter much unless you need to get at your data on a Windows
system.  rzip is in OpenBSD ports, and apparently compiles easily enough
on most Unix-like platforms.  (No, I really don't care how easily it
"could" be ported to Windows, unless you have actually done it.)
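
Putting warnings 1 and 2 together, the workflow ends up as
stage-on-local-disk, compress, checksum, burn.  A sketch (paths and the
commented-out dump/burn lines are assumptions; cksum stands in for the
MD5 step -- md5(1) on OpenBSD, md5sum on Linux -- and bzip2 stands in
when rzip isn't installed, so the sketch runs anywhere):

```shell
#!/bin/sh
set -e
STAGE=./stage                   # local scratch partition, not NFS
mkdir -p "$STAGE"
rm -f "$STAGE"/home.dump.*

# dump -0au -f "$STAGE/home.dump" /home   # on the real server, as root
echo "stand-in dump output" > "$STAGE/home.dump"

# rzip cannot read a pipe, so it compresses the staged file in place;
# -k keeps the original and the result lands in home.dump.rz:
if command -v rzip >/dev/null 2>&1; then
    rzip -k "$STAGE/home.dump"
else
    bzip2 -k "$STAGE/home.dump"  # fallback so the sketch still runs
fi

# Checksum the compressed image so the burn can be verified later:
cksum "$STAGE"/home.dump.* > "$STAGE/checksums"

# cdio -f /dev/rcd0c tao "$STAGE"/home.dump.* "$STAGE/checksums"  # burn
cat "$STAGE/checksums"
```

After burning, re-running the checksum against the files read back off
the DVD is what catches bad media -- and, per warning 2, that
verification side needs far less RAM than the compression side.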

Nick.
