Re: No-brainer backup from Linux to space on remote drive?

2012-02-15 Thread Jerry Feldman
On 02/14/2012 05:55 PM, Ralph A. Mack wrote:

 Thanks, all,

 I looked through the suggestions. Remote backup turned out to be
 something different than what I had in mind. BackupPC is expected to
 sit on a centralized server and so, it would seem, is rsnapshot; at
 least rsnapshot uses Linux file system properties to optimize storage.

 I try to run my domestic LAN on a self-service basis so I don't have
 to come home from a day of programming to be the domestic systems
 admin, i.e. the bottleneck. I'm providing a NAS drive with private
 areas for the individuals in the house. My notion is that they can use
 any backup tool they like locally on their systems to push their data
 onto the provided NAS area. As long as the NAS drive doesn't become
 inaccessible, it doesn't become my problem. :) Of course, if they ask,
 I can suggest tools they might want to learn about and use. This is
 very different from an office, where it's somebody's job to do this stuff.

 So they've got three Windows machines between them to worry about.
 I've got a handful of boxes including two or three running Linux. For
 each Linux box, I'm just looking for a daemon that runs as a service
 that does periodic incremental backups of user data and system
 configuration behind the scenes, pushing the bits to a NAS drive and
 using the NAS storage area to keep track of where it is in the backup
 cycle. If it saves enough so I can reconstruct the system more or less
 as it was if the hard drive crashes, I'm happy. 

 If backup (or any act of maintenance) is something I need to remember
 to do, it will never happen. If it's something I can set up once and
 then forget about for a few years, that'll work. I know that's not the
 attitude of an IT professional, but home is where I come to leave my
 profession behind for a few hours and use my computers to make art and
 music and stories and write essays and plan the revolution :), using
 open source tools wherever I can.

 Can I get rsnapshot to do the kind of thing I'm talking about without
 writing a lot of additional scripting, or is there a better tool for
 this kind of operation?


I use rsnapshot at home and at work. While it is essentially a Perl
script wrapped around rsync, it generally works well. Files that are
unchanged between runs are hard linked, so you can keep as many backups
as you want. Each backup is essentially incremental, but the resulting
directory appears as a full backup. If you pick up a corrupted file, it
may be corrupted in one or more recent backups, but not in some of the
older ones. At work, we also get backed up by our New York office, which
uses rsync. What I really like about rsnapshot is that recovery is easy.
Recently I was unable to do an upgrade install to Fedora 16, so I simply
rebuilt from scratch and copied all my files back to my home directory;
I also back up /etc for some of the configuration.
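The hard linking described above is easy to see for yourself: rsnapshot rotates snapshots by hard-linking the previous tree (the same effect as `cp -al`), so unchanged files share an inode across snapshots. A minimal demonstration (the directory names here are invented):

```shell
# Simulate two rsnapshot-style snapshot directories sharing one file.
rm -rf /tmp/snapdemo && mkdir -p /tmp/snapdemo/daily.0
echo "unchanged data" > /tmp/snapdemo/daily.0/file.txt

# Rotating a snapshot hard-links the old tree rather than copying it:
cp -al /tmp/snapdemo/daily.0 /tmp/snapdemo/daily.1

# Both snapshots now reference the same inode; the link count is 2,
# and the second "copy" consumes no extra data blocks.
stat -c %h /tmp/snapdemo/daily.1/file.txt   # prints 2
```

So ten snapshots of a mostly unchanged tree cost little more than one.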

-- 
Jerry Feldman g...@blu.org
Boston Linux and Unix
PGP key id:3BC1EB90 
PGP Key fingerprint: 49E2 C52A FC5A A31F 8D66  C0AF 7CEA 30FC 3BC1 EB90




___
gnhlug-discuss mailing list
gnhlug-discuss@mail.gnhlug.org
http://mail.gnhlug.org/mailman/listinfo/gnhlug-discuss/


Re: No-brainer backup from Linux to space on remote drive?

2012-02-15 Thread Ralph A. Mack
Thanks folks,

As usual, one size does not fit all so thanks for the variety of answers. 

I'll use DejaDup for my laptop, as Stephen mentioned. I deliberately keep it 
from having a lot of system configuration to duplicate on my day-to-day 
systems. I should be able to rebuild it from a raw OS reinstall pretty 
trivially. So DejaDup will keep all my home areas and the contents of my data 
drive safe.

I'm liking this rdiff-backup arrangement for my DNS server. Thanks, Lloyd. 
There, everything is about system configuration and recovery looks like - 
switch everybody else's systems to do DNS from the router temporarily and get 
the bugger back up fast. 

I set up my DNS server so I can maintain static address allocations and systems 
can find each other by their assigned host name. It's an authoritative server
for local addresses, behind the router, invisible to the external world. It 
points to the router to find outside addresses. (<grump>Why doesn't my
router's DNS config support this feature?</grump>) Is there any reason I
shouldn't make
the NAS drive just get its DNS from the router rather than the DNS server 
everything else is using? It seems sensible not to make its operation dependent 
on any system whose data it stores. I think it's just using DNS to get to the 
time server and the site from which it finds out about updates. The router 
should be fine for that. Does a Samba server care about DNS naming for its 
clients? I didn't think so.

Ralph

On Feb 14, 2012, at 18:06, Lloyd Kvam wrote:

 On Tue, 2012-02-14 at 15:16 -0500, Ralph A. Mack wrote:
 Backup for me is a practical necessity rather than a life project, so
 I want something that just works, errs on the side of caution, doesn't
 require continuing attention and maintenance, etc.
 
 I use rdiff-backup.  The current files are in place on the backup along
 with a change history (rdiffs).  You'll need to resort to the
 rdiff-backup command line if you want to use the rdiff history to get an
 ancient version; the current version, however, can simply be copied.
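For example (a hypothetical sketch; the file names are invented, and the backup path matches the setup below), the current version is a plain copy while an old version comes out of the rdiff history with `rdiff-backup -r` and a time spec:

```shell
# Current version sits in place on the backup -- just copy it:
cp /media/backup/venix-laptop/home/lloyd/notes.txt ~/notes.txt

# Pull the version from 10 days ago out of the rdiff history:
rdiff-backup -r 10D /media/backup/venix-laptop/home/lloyd/notes.txt \
    /tmp/notes-10-days-ago.txt
```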
 
 This is what I use to backup my laptop:
 
 # ionice -c 3 means idle -- do io when system is otherwise idle
 
 ionice -c 3 rdiff-backup --exclude-other-filesystems \
     --exclude-special-files --exclude=**/tmp \
     --exclude=**/var/tmp / /media/backup/venix-laptop
 
 My backup drive is mounted locally, but rdiff-backup will use ssh to
 backup over your network connection.  Change /media/backup to something
 like root@backup-server:/backup-dir
 
 I've convinced myself that ionice actually speeds up the backup by
 avoiding conflicts accessing the drive.  I did not make any careful
 measurements to back that up.
 
 -- 
 Lloyd Kvam
 Venix Corp
 DLSLUG/GNHLUG library
 http://dlslug.org/library.html
 http://www.librarything.com/catalog/dlslug
 http://www.librarything.com/catalog/dlslug?sort=stamp
 http://www.librarything.com/rss/recent/dlslug
 




Re: No-brainer backup from Linux to space on remote drive?

2012-02-15 Thread Alan Johnson
On Tue, Feb 14, 2012 at 6:48 PM, Stephen Ryan step...@sryanfamily.info wrote:

 On Tue 14 Feb 2012 05:06:24 PM EST, Alan Johnson wrote:
  On Tue, Feb 14, 2012 at 4:47 PM, Stephen Ryan
  step...@sryanfamily.info wrote:
  Deja Dup is the default backup app in Ubuntu 11.10; it was very
  easy to
  get set up and it's very unobtrusive during normal usage.
 
 
 
  That's a client side app that would have to be configured on each
  client, right?  Backuppc will certainly take a little fiddling for
  each machine, but not much more than a client side backup program will
  take for each machine and it will handle backups for all the mentioned
  systems in one web interface running on one server, so long as the
  files you want to backup are made available over the network.

 Yup it is; I was assuming, though, that with a WD network drive, he'd
 have an easier time of setting up client-side backups than trying to
 persuade the WD network drive to install something centralized.  My
 assumption might very well be wrong...


Any network drive could be used as the backup storage by any other machine
running the backup service, but with the additional clarifications, your
guess was more useful than mine anyway. =)  I'd go this route too if all
I were doing was a single client and didn't care whether other people's
backups fail.  I'm the sysadmin in my house whether I like it or not.  If
someone isn't doing proper backups, I'm the one stuck with the job of
data recovery anyway, which is far more work, headache, and heartache
than keeping a backup server running.  =)


Re: No-brainer backup from Linux to space on remote drive?

2012-02-15 Thread Alan Johnson
On Tue, Feb 14, 2012 at 5:55 PM, Ralph A. Mack ralphm...@comcast.net wrote:

 If backup (or any act of maintenance) is something I need to remember to
 do, it will never happen. If it's something I can set up once and then
 forget about for a few years, that'll work. I know that's not the attitude
 of an IT professional, but home is where I come to leave my profession
 behind for a few hours and use my computers to make art and music and
 stories and write essays and plan the revolution :), using open source
 tools wherever I can.


With the other clarifications in this thread, I think you are on the right
path for your goals.  However, unless I am missing your hyperbole, you are
dooming yourself if you plan to forget about your backups for a few years.
At the very least, you need to check it once in a while to make sure it is
still running as expected, no matter what solutions you go with.  The
least-effort, safest solution is probably to have something email you when
it runs with short but sufficient output to confirm the backup ran
completely as expected.  That way you can be sure things are happy with a
glance at an email and a quick delete, and the more frequently you get
them, the sooner you are likely to notice if they stop coming.  I'd go
with daily myself, but you'll have to figure out what's right for you.
Don't rely on emails sent only in the case of failure, because your email
system could fail just as well.
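A sketch of that mail-on-every-run idea (hypothetical: the backup command, recipient, and subject format are all placeholders, and it assumes a working `mail` command and MTA on the machine):

```shell
#!/bin/sh
# Run from cron, e.g.:  0 2 * * *  /usr/local/bin/backup-and-report
# Always sends mail, success or failure, so silence itself is a warning.
LOG=$(mktemp)
if rsnapshot daily >"$LOG" 2>&1; then
    STATUS="OK"
else
    STATUS="FAILED"
fi
mail -s "backup $STATUS on $(hostname) $(date +%F)" you@example.com <"$LOG"
rm -f "$LOG"
```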


Re: No-brainer backup from Linux to space on remote drive?

2012-02-15 Thread Ralph A. Mack
On Feb 15, 2012, at 14:25, Alan Johnson wrote:

 On Tue, Feb 14, 2012 at 5:55 PM, Ralph A. Mack ralphm...@comcast.net wrote:
 If backup (or any act of maintenance) is something I need to remember to do, 
 it will never happen. If it's something I can set up once and then forget 
 about for a few years, that'll work...
 
 With the other clarifications in this thread, I think you are on the right 
 path for your goals.  However, unless I am missing your hyperbole, you are 
 dooming yourself if you plan to forget about your backups for a few years.  
 At the very least, you need to check it once in a while to make sure it is 
 still running as expected, no matter what solutions you go with.  

Yeah, I probably didn't say exactly what I meant, just what I wished I could 
mean. :) The key thing is that I can be reactive at need rather than proactive. 
Email is a good tool to tell me I'd better take a look. That'll work. Like 
several years ago when my son was playing by himself in the other room and then 
things got a little _too_ quiet and I had to go see what he was up to. :) 

Since I'm setting up daily backup and I generally log myself out, I can have my 
systems tell me the date of the last successful backup when I log in, too. If 
the backups go down and I haven't logged in since then, I haven't been 
generating any new data to back up. I can fix them before I start working again 
and be ok. That's probably the lowest life-impact solution, but it'll take a 
little more work to set up, so I'll probably go the email route even though it 
means a bit more daily spew from several systems to glance at and toss out.
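A sketch of that on-login check (hypothetical: the log path and the "backup OK" line format are invented here; whatever backup tool is in use would need to log something similar):

```shell
# Append to ~/.profile: print the date of the last successful backup
# at login. Assumes the backup job appends lines like
# "2012-02-15 backup OK" to a log file.
last_backup_status() {
    log="${1:-$HOME/.backup.log}"
    if [ -r "$log" ]; then
        grep ' backup OK$' "$log" | tail -n 1
    else
        echo "no backup log found at $log"
    fi
}
last_backup_status
```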

One characteristic of all us tech folks, I think - we'll put an amazing amount 
of effort into all sorts of Rube Goldberg devices to afford us the sheer luxury 
of being magnificently lazy. :) Here I'm protesting that I won't put a lot of 
effort into setting up backups but I'm already thinking about what I'd do for a 
shell script to scrape the logs, determine success or failure, and then flash it
up on the screen at login. I think it's a form of madness.

Thanks,
Ralph



Re: No-brainer backup from Linux to space on remote drive?

2012-02-15 Thread Alan Johnson
On Wed, Feb 15, 2012 at 3:11 PM, Ralph A. Mack ralphm...@comcast.net wrote:

 Yeah, I probably didn't say exactly what I meant, just what I wished I
 could mean. :) The key thing is that I can be reactive at need rather than
 proactive. Email is a good tool to tell me I'd better take a look. That'll
 work. Like several years ago when my son was playing by himself in the
 other room and then things got a little _too_ quiet and I had to go see
 what he was up to. :)


haha!  My kids are just getting past that age: they are often quiet, but it
still doesn't feel right. =)


 Since I'm setting up daily backup and I generally log myself out, I can
 have my systems tell me the date of the last successful backup when I log
 in, too. If the backups go down and I haven't logged in since then, I
 haven't been generating any new data to back up. I can fix them before I
 start working again and be ok. That's probably the lowest life-impact
 solution, but it'll take a little more work to set up, so I'll probably go
 the email route even though it means a bit more daily spew from several
 systems to glance at and toss out.


I love the on-login idea!  Just add it to your startup applications (on
GNOME or Unity; dig around in the menus).  A command like `tail -n #
logfile > /tmp/backup-status && gedit /tmp/backup-status` should do the
trick (gedit won't read from a pipe).  More reliable than email too, in
this case.



 One characteristic of all us tech folks, I think - we'll put an amazing
 amount of effort into all sorts of Rube Goldberg devices to afford us the
 sheer luxury of being magnificently lazy. :) Here I'm protesting that I
 won't put a lot of effort into setting up backups but I'm already thinking
 about what I'd do for a shell script to scrape the logs, determine success
 or failure, and the flash it up on the screen at login. I think it's a form
 of madness.


By that logic, it could be argued that the entirety of computing is just a
series of Rube Goldberg machines. =)  Just look at any source code.  Think
about what it takes to turn even a basic hello-world program into a
pattern of photons that hit your eyes just right.  I think it is just the
nature of
building off of other people's work.  I like to think of it as standing on
the shoulders of giants.

I once read that a good system administrator is one who automates tedious
tasks before they become tedious.  This implies a perfect amount of lazy. =)