Thanks very much Ben,
Great explanation!

Cheers,
Cameron

On Tue, Sep 13, 2011 at 12:56 PM, Ben Scott <[email protected]> wrote:

> [aggregate reply to multiple messages]
>
> On Tue, Sep 13, 2011 at 9:48 AM, Cameron <[email protected]> wrote:
> > IBM P520 - 2 LPARs - 1 - AIX 5.3, 2 - AIX 6.1
> > Need to backup approx 80GB (roughly 40GB per LPAR)
> >
> > I was using BE2010 to backup (to tape) my Windows boxes as well as the
> Unix
> > clients but I don't *have* to stick with this. I have a spare server
> > (Windows) that has a tape library attached that I could use (gigabit
> backbone
> > for the whole network).
>
>  I would generally prefer to do a disk-to-disk backup to the one
> server, and then backup that one server to tape for offline/offsite
> backups.
>
> > Email of completion status, ease of use for backup/restore.
>
>  Email is super-easy.  Most scheduled things in Unix-land are kicked
> off from cron, and cron automatically emails job output to the job
> owner.
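>
>  For example, a root crontab entry like the following (the script path
> is illustrative) runs a backup nightly at 01:30, and cron mails whatever
> the job prints to the crontab owner:

```shell
# m  h  dom mon dow  command
30   1  *   *   *    /usr/local/sbin/nightly-backup.sh
# cron mails the job's stdout/stderr to the crontab owner by default,
# so any output from the script arrives as email with no extra setup.
```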
>
>  You will have to be more specific for "ease of use".  In particular,
> note that "easy to use" and "easy to learn" are often inversely
> proportional, and different people find different UIs easy/hard.
>
>  The most common drawbacks with a simple tar involve restores:
>
> * tar is a sequential format.  If the file you want is near the end of
> the archive, it has to read through everything else first to find it.
> It is simple and robust, but slow.
>
> * Cherry-picking files during a restore can be tedious, if you don't
> know the exact name.  You have to list the archive to get the exact
> name, then run it again to extract.
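>
>  The usual two-pass restore with GNU tar looks like this (archive and
> file names are made up for the demo):

```shell
# Build a small archive to restore from (illustrative data).
mkdir -p demo/etc demo/home
echo "hosts" > demo/etc/hosts
echo "notes" > demo/home/notes.txt
tar -czf demo.tar.gz -C demo etc home

# Pass 1: list the archive to find the exact member name.
tar -tzf demo.tar.gz | grep notes
# Pass 2: read the whole archive again to extract just that member.
mkdir restore
tar -xzf demo.tar.gz -C restore home/notes.txt
cat restore/home/notes.txt
```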
>
> On Tue, Sep 13, 2011 at 10:38 AM, Cameron <[email protected]>
> wrote:
> > It's a combination of Pro Isam/Oracle databases and our ERP application
> and
> > a few other apps...
>
>  If the applications keep their files "hot" all the time, you will
> have to find a way to get said apps to either (1) do an online backup
> to separate files which you can then backup or (2) temporarily quiesce
> and make consistent their files on-disk during the backup.
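>
>  Option (2) can be as simple as wrapping the backup in the app's own
> stop/start commands.  This is only a sketch: stop_app/start_app below
> are placeholders for your ERP's real control procedure (e.g. AIX
> "stopsrc -s <subsystem>" / "startsrc -s <subsystem>"), and the data
> path is made up:

```shell
#!/bin/sh
# Sketch only: quiesce the app around the backup so its on-disk files
# are consistent.  stop_app/start_app are placeholders -- substitute
# your application's real stop/start commands.
stop_app()  { echo "app stopped"; }     # placeholder
start_app() { echo "app restarted"; }   # placeholder

mkdir -p appdata && echo "ledger" > appdata/ledger.dat   # stand-in data

stop_app
trap start_app EXIT                     # restart even if tar fails
tar -czf appdata_backup.tar.gz appdata
```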
>
> > Am I reading correctly that with 'tar', if you wanted to exclude
> > directories you have to create a file and within that file list them out?
>
>  It depends on the variant of tar you're using.  I'm not familiar
> with the one that comes with AIX, but GNU tar supports wildcards in
> its exclusions.
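>
>  With GNU tar you can also pass exclusions right on the command line,
> no separate file needed (names below are illustrative):

```shell
mkdir -p music docs
echo "song" > music/track.mp3
echo "text" > docs/readme.txt
# --exclude takes shell-style wildcards; repeat it for each pattern.
tar -czf docs_only.tar.gz --exclude='*.mp3' music docs
tar -tzf docs_only.tar.gz
```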
>
> > What I wonder is how the performance would be
> > doing it that way. Would it be better performance to tar on the box and
> then
> > copy it to the windows share?
>
>  It will likely be overall faster to tar directly to the target
> system.  That way you're reading from a local disk and writing to the
> network, once each.  Doing it to local disk first would mean twice as
> many read operations (once to create the tar archive, then to copy
> it), and will also cause I/O contention unless you have separate
> spindles (it will be writing to the same disks it's reading from).
>
> > I may have missed reading it, but is there a
> > way to produce a text file listing of all the files that were successfully
> > tarred?
>
>  Add the --verbose (-v) switch.
>
>  Here is an example, using GNU tar and GNU date.  AIX variants may
> not support all the GNU features.  However, the GNU variants are
> available for AIX, so if you don't already have them, get them.  :)
>
>        TODAY=$(date --iso-8601)
>        tar --create --gzip --verbose --totals \
>                --file=/mnt/backupserver/${HOSTNAME}_${TODAY}.tar.gz \
>                --files-from=/etc/backup/include \
>                --exclude-from=/etc/backup/exclude \
>                > /mnt/backupserver/${HOSTNAME}_${TODAY}.log
>
>  The first command just saves the date, in YYYY-MM-DD format.  The
> second command does the backup.  The option switches are:
>
>        --create        create archive (as opposed to --list, --extract, --diff, etc.)
>        --gzip          compress with GNU gzip (if you have more I/O than CPU, omit this for speed)
>        --verbose       list file paths as they are written (use twice to get file details)
>        --totals        print total bytes written and performance at end
>
>  The last part of the command redirects standard output to a file, so you get a
> log with the file list.  Significant messages (errors, etc.) will
> still print to standard error, so you'll get those on the console or
> in the cron job email, without being flooded with every single file
> backed up.
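>
>  The split is just the usual stdout/stderr distinction; a quick way to
> see it in action (file names are illustrative):

```shell
mkdir -p data && echo "x" > data/file.txt
# stdout (the --verbose file list) goes to backup.log; stderr goes to
# errors.log, which is where tar's complaint about the missing path
# lands -- in the cron setup, that is what would reach the email.
tar -czvf out.tar.gz data no_such_dir > backup.log 2> errors.log || true
cat backup.log    # member names only
cat errors.log    # the "no_such_dir: ... No such file" message
```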
>
>  The /etc/backup/include file could look like:
>
>                /etc/
>                /home/
>                /usr/local/
>
>  The /etc/backup/exclude file could look like:
>
>                /usr/local/tmp/
>                *.mp3
>                *~
>
>  You get the idea.
>
> -- Ben
>
> ~ Finally, powerful endpoint security that ISN'T a resource hog! ~
> ~ <http://www.sunbeltsoftware.com/Business/VIPRE-Enterprise/>  ~
>
> ---
> To manage subscriptions click here:
> http://lyris.sunbelt-software.com/read/my_forums/
> or send an email to [email protected]
> with the body: unsubscribe ntsysadmin
>
>
