Backups (was: /opt/ again)

1999-09-15 Thread Marc Haber
On Tue, 14 Sep 1999 17:11:07 -0700, you wrote:
Tuesday, September 14, 1999, 4:52:43 PM, Marc wrote:
 IBTD. Backups are to get a crashed system up again _FAST_. And this
 can be accomplished by dropping a single tape in.

There comes a point where one loses more time restoring from tape than
from recreating from scratch.

Considering one can install a fairly robust system (FreeBSD, Debian) over
FTP/NFS in under an hour

If a broadband internet connection is available, yes. That doesn't
apply to all sites.

and it takes 2-3 hours to go through a gig of data

The DDS-3 tape drive we use can backup and verify about 2 Gig per
hour. That corresponds to a throughput of roughly 4.5 Mbps. In
Germany you can either have 2 Mbps or 34 Mbps (the latter costing
about 5 KEuro per month in line cost without even being connected
to an ISP). Thus, your approach is rarely applicable over here.
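The arithmetic behind that figure can be checked quickly (a sketch; the 2 GB/hour tape rate is the one quoted above):

```shell
# Convert a tape rate of 2 GB/hour into an equivalent line speed.
# 2 GB = 16000 megabits (decimal units); one hour = 3600 seconds.
awk 'BEGIN { printf "%.1f Mbps\n", 2 * 8000 / 3600 }'
# prints 4.4 Mbps -- i.e. the "roughly 4.5 Mbps" above
```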

I would
much rather reinstall the programs and retrieve the relatively small data
(/etc, btw, is data).

ack. I'd dispense with backing up /usr iff dpkg could re-install all
packages that are known to be installed from /usr/state with a single
call.
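For what it's worth, dpkg's selections interface already comes close to this. A hedged sketch (file names here are invented, and dpkg actually keeps its database under /var/lib/dpkg rather than /usr/state):

```shell
#!/bin/sh
# Sketch: save the installed-package list alongside the data backup, so
# the package set can be rebuilt in (almost) a single call after a
# reinstall. Guarded so the sketch also runs on a non-Debian box.
if command -v dpkg >/dev/null 2>&1; then
    dpkg --get-selections > selections.list
else
    printf 'bash\t\t\t\tinstall\n' > selections.list   # stand-in entry
fi

# On the freshly installed system one would then run (not executed here):
#   dpkg --set-selections < selections.list
#   apt-get dselect-upgrade
test -s selections.list && echo "selections saved"
```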

btw, if you do it this way, /usr/local has to be considered data, too.

Greetings
Marc

-- 
-- !! No courtesy copies, please !! -
Marc Haber  |Questions are the | Mailadresse im Header
Karlsruhe, Germany  | Beginning of Wisdom  | Fon: *49 721 966 32 15
Nordisch by Nature  | Lt. Worf, TNG Rightful Heir | Fax: *49 721 966 31 29



Re: Backups (was: /opt/ again)

1999-09-15 Thread Steve Lamb
On Wed, Sep 15, 1999 at 07:39:19AM +, Marc Haber wrote:
 Considering one can install a fairly robust system (FreeBSD, Debian) over
 FTP/NFS in under an hour
 
 If a broadband internet connection is available, yes. That doesn't
 apply to all sites.

Who said anything about an internet connection?  Would you do NFS over the
internet?  In any shop that is considering getting the machine up fast the
quickest way would be to get the bare system on, throw it on the network, and
either mount a drive over NFS or FTP.  At least, that is how we did things in
the shop I was in.

Of course, now we just have master disks for everything: we dupe the
disk and drop the data back in off backup.  It takes us ~1/2 hour to replace
a machine and get it back up and running with the previous configuration.
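The master-disk approach can be sketched with files standing in for devices (real device names like /dev/sda are site-specific, and dd is the classic tool for a raw clone):

```shell
# Hypothetical sketch: duplicate a golden image onto a replacement
# drive, then restore only the (small) data set from backup.
dd if=/dev/zero of=master.img bs=1024 count=64 2>/dev/null  # stand-in golden image
dd if=master.img of=replacement.img bs=1024 2>/dev/null     # dupe the disk
cmp -s master.img replacement.img && echo "clone verified"
```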

 The DDS-3 tape drive we use can backup and verify about 2 Gig per
 hour. That compares to the throughput of roughly 4.5 Mbps. Since in
 Germany you can either have 2 Mbps or 34 Mbps (the latter costing
 about 5 KEuro per Month in line cost without even having it connected
 to an ISP). Thus, your approach is rarely applicable over here.

2Gb for the programs, which are stored on a variety of different media, or
2Gb for the data?  I'd rather have 2Gb of data, thanks.

 ack. I'd dispense with backing up /usr iff dpkg could re-install all
 packages that are known to be installed from /usr/state with a single
 call.

File a bug on it, then.  I would like to see that too, having lost /usr once
on my server at home.
 
 btw, if you do it this way, /usr/local has to be considered data, too.

Depends on what you have in there.  If it is stuff that is easily replaced
from source, recompile.  I'd back up the sources, not the programs themselves.
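That division of labour can be sketched with tar (the directory layout below is invented for illustration):

```shell
# Archive the source trees under a stand-in /usr/local layout; built
# binaries are deliberately left out, since they can be rebuilt with make.
mkdir -p local/src/hello
echo 'int main(void){return 0;}' > local/src/hello/hello.c
tar czf local-src.tar.gz -C local src
tar tzf local-src.tar.gz        # lists src/, src/hello/, src/hello/hello.c
```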

-- 
 Steve C. Lamb | I'm your priest, I'm your shrink, I'm your
 ICQ: 5107343  | main connection to the switchboard of souls.
---+-



Re: Backups (was: /opt/ again)

1999-09-15 Thread Anders Arnholm
Steve Lamb wrote:
   Depends on what you have on there.  If it is stuff that is easily replaced
  from source, recompile.  I'd backup the sources, not the programs themselves

It's almost always faster to recreate everything (identical) from backup
than from something else. (I suppose you don't have an automated make
alllocalsoftware ; make install rule in your source directory.)

/ Balp




Re: Backups (was: /opt/ again)

1999-09-15 Thread Steve Lamb
On Wed, Sep 15, 1999 at 04:04:01PM +0200, Anders Arnholm wrote:
 Steve Lamb wrote:
Depends on what you have on there.  If it is stuff that is easily replaced
   from source, recompile.  I'd backup the sources, not the programs
   themselves
 
 It's almost always faster to recreate everything (identical) from backup
 than from something else. (I suppose you don't have an automated make
 alllocalsoftware ; make install rule in your source directory.)

That is why I said depends.  /usr/local, being solely the domain of the
local administrator, does not fall well under general rules.  My
personal preference is to not back up what is there, because I normally keep
the sources for anything there archived, and each one is but a make away.
If I had 40-50 different customized things in there, I'd agree.  But I've
never gotten to that point, even on my Slackware systems when I hand-updated
everything.  I stress that that is my situation, and exactly why I started
with depends in the quoted material.

-- 
 Steve C. Lamb | I'm your priest, I'm your shrink, I'm your
 ICQ: 5107343  | main connection to the switchboard of souls.
---+-