On 2003.10.30 05:34 Challison wrote:
> Good Morning All,
>
> ...it is assuming that the administrator would perform a fresh
> install and then restore the files to the new existing directories and
> overwrite everything.
>
> My question is twofold:
>
> What is your opinion of this method, and if you feel that it is lacking
> then how would you improve it?
>
> and
>
> Are there other vital directories that have been left out that are
> needed?
Nothing is really vital, is it? When you have more than one machine running, it's easy to get configuration files you forgot about off the other machine. /etc/X11/XF86Config (or XF86Config-4) is the only file that really bugs me when I don't have it, so I keep copies on an ftp server with the machine name attached to the file name (a sketch of that is at the end of this message). Windows adds complications to this picture that I don't bother with, but free software can be very useful for people who do.

It is much easier to do an install than it is to make backups of system files. This seems strange to people in the Windoze world, where CDs and "original" software have value and bad things happen in DLL/Registry land, but it's true. The binaries are freely available, and the ones you get from your favorite distro are better than the old ones on a CD.

In a corporate environment you would keep one or two default images up to date, made with partimage or something similar. The real and important information is all in /home and /etc. /home has everything your users will care about that's not on a central server. For individuals, a fresh install almost always works better.

To improve the script, I'd tar things directly rather than make copies, and I wouldn't do /etc and /home at the same time; /etc does not change nearly as often as things in /home do. Also, copying throws away the time stamp information you want to have in a backup, while tar preserves it and can do incremental backups (examples at the end of this message). Besides, it's hard to find the file you want when you make too many backups.

Over time, backups have become less important to me. I've never lost anything to a system failure, even when hard drives failed. The last two times drives failed on me, I had plenty of time to get the information off them. I've never suffered a system failure due to software. OK, I run boring old Debian stable, but it is what it says it is. Between that and the ease of setting up a new system, I don't feel compelled to back up system files.

I've not had to worry about Windblows because I hardly use the one copy I have and don't let it see the network. I've got free software to get at all my Windows-based work, so when my last install of Windblows dies, it's simply gone. I hardly use my scanner anyway.

I've backed up classwork, projects and work files one thing at a time. Things that are really important, like my first year of baby pictures or my mom's 45s, get made into CDs that I give other people (the burning commands are at the end of this message). New pictures and work files get transferred back and forth between a desktop and a laptop, so multiple copies exist (there's an rsync sketch at the end as well). Window manager settings and that sort of thing are just too easy to tweak, so I don't bother with .kde and all that. In fact, I kind of like each computer I use to have its own personality.

To back up projects I'm working on, I first back up everything:

# run this from the directory that contains project/
find project -type f > list            # list every file in the project
vi list                                # prune anything you don't want kept
tar cvf project.tar `cat list`         # archive the files named in list

Then, as I feel like it, I make an incremental backup:

# again, from the directory above project/
find project -type f -newer list > new_list   # only files changed since last time
vi new_list
tar cvf project_date.tar `cat new_list`       # substitute today's date for "date"
mv new_list list    # new_list's time stamp becomes the reference for next time

You might have other complications if you use symlinks in your project, but those too can be followed and backed up (GNU tar's -h flag archives what a symlink points to instead of the link itself).

You can script this if you have many projects (a sketch is at the end of this message). Use something like:

date_stamp=`date -I`
echo $date_stamp

to get the date into your filename.

There are many fine backup scripts out on the web for every conceivable purpose under heaven. It might be a nice lecture for the newbie group.
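Since I brought up scripting it, here is a rough sketch of what I mean, just the find/tar scheme above wrapped in a loop. The work directory and project names are made up, and it skips the vi step, so trim the lists by hand if you care:

#!/bin/sh
# Sketch only: directory and project names below are made up.
# Backticks choke on filenames with spaces.
date_stamp=`date -I`
cd /home/me/work || exit 1            # the directory that holds the projects

for project in thesis website photos; do
    if [ ! -f $project.list ]; then
        # first run: back up everything and start a reference list
        find $project -type f > $project.list
        tar cf ${project}_$date_stamp.tar `cat $project.list`
    else
        # later runs: only files changed since the reference list was made
        find $project -type f -newer $project.list > $project.new
        [ -s $project.new ] && tar cf ${project}_$date_stamp.tar `cat $project.new`
        mv $project.new $project.list   # this run's time stamp becomes the reference
    fi
done

I'd run something like that by hand a few times before trusting it to cron.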
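For the /etc and /home split, I mean something along these lines. The /backup paths are placeholders, and the --listed-incremental part is GNU tar's own incremental support rather than anything from the original script:

# /etc is small and changes rarely, so just grab all of it each time
tar czf /backup/etc_`date -I`.tar.gz /etc

# /home with GNU tar incrementals: the snapshot file records what has been
# seen, so each later run only picks up what changed since the run before
tar czf /backup/home_`date -I`.tar.gz --listed-incremental=/backup/home.snar /home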
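And the config-file stash I mentioned at the top is nothing fancier than this; the server name and path are invented, and I'm showing scp where plain ftp would do just as well:

# keep a copy of the X config off-machine, tagged with this machine's name
# (server.example.com and the configs/ directory are placeholders)
scp /etc/X11/XF86Config-4 me@server.example.com:configs/XF86Config-4.`hostname`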
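For the keepsake CDs, I'm assuming the usual mkisofs plus cdrecord pair; the directory name and the dev= numbers are placeholders (cdrecord -scanbus will show you yours):

# build an ISO image from a directory (-r -J for sane Unix and Windows names)
mkisofs -r -J -o pictures.iso ~/pictures
# burn it; replace dev=0,0,0 with what cdrecord -scanbus reports
cdrecord -v dev=0,0,0 pictures.iso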
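The desktop-laptop shuttling is a natural fit for rsync over ssh; the machine and directory names here are invented:

# push anything newer from here to the laptop; -a keeps time stamps and
# permissions, -u leaves alone files that are newer on the far side
rsync -avu -e ssh ~/pictures/ laptop:pictures/
# and pull the other way to complete the round trip
rsync -avu -e ssh laptop:pictures/ ~/pictures/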
It's not on http://groups.yahoo.com/group/cccclinuxsig/files/class_outline.txt, so I'll mention it next week.
