Yes, groups like this really are your best bet for learning things... Plus, this group seems more helpful, good-natured, and funny than any other tech group I belong to.
I did the same kind of thing with NT, but with DOS there was no online documentation until much later. So I got in the habit of buying the Norton stuff (back when Peter Norton Computing was not owned by that lame Symantec...) I had his PC guru handbook, his assembly books, you name it. And of course I had all of the Norton Utilities tools.
If you consider yourself a beginner, don't be afraid to read the man pages either... That's how I taught myself Unix (way back when Bell Labs owned it ;-)
Looks like you really don't need much of that kind of thing with all of the tools built-in on a UNIX based system, which is great!
ls, more and grep I've used, both on NT and the various UNIX OSes I've tinkered with. Borland's grep port was very important in the Turbo C world, too.
Seriously... just pick a list of the more common commands... "ls, more, less, ps, find, grep", etc... and read the man pages... once you've mastered that small list, the rest come easy..
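A quick way to make that short list stick is to drill the commands against a scratch directory. A minimal sketch, where the filenames and the 'error' pattern are made up purely for illustration:

```shell
# Practice the common commands on throwaway files (nothing here is
# from a real system -- it's all created and removed on the spot).
dir=$(mktemp -d)
printf 'hello\nerror: disk full\n' > "$dir/messages"
printf 'all quiet\n' > "$dir/syslog"

ls "$dir"                         # list the two files
find "$dir" -name 'mess*'         # locate files by name pattern
grep -n 'error' "$dir/messages"   # search with line numbers
grep -l 'error' "$dir"/*          # which files match at all?

rm -rf "$dir"
```

Reading `man grep` alongside a drill like this (what do `-n` and `-l` do?) tends to make the flags stick better than reading alone.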
I'm going to have to figure out a scheme for that, which is a great idea. Thanks!
As for having to reload when you screw up... Do backups... learn to use tar and/or dump/restore. Set your system up to use separate filesystems for '/', '/usr', '/boot', etc., and back those filesystems up into files on /home or /data or /backups, etc. (or a tape drive if you have one)... and there's nothing wrong with having alternate roots and boots... so if you do mess something up, it becomes trivial to 'recover'. I try to regularly copy my filesystem backups to CDs. If you keep your filesystems separate, / and /usr should easily fit onto a CD...
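That scheme might look something like this in practice. This is only a sketch: the mount points and the /backups target are examples, and it assumes GNU tar (for `--one-file-system`, which keeps each archive from wandering into other mounts):

```shell
# One dated, compressed tarball per filesystem, dropped under /backups.
# Adjust the list of mount points to match your own layout.
stamp=$(date +%Y%m%d)
for fs in / /usr /boot; do
    name=$(echo "$fs" | tr / _)       # "/usr" -> "_usr" for the filename
    tar -czf "/backups/fs${name}-${stamp}.tar.gz" \
        --one-file-system "$fs"
done
```

Restoring a file later is just `tar -xzf` of the right archive, which is what makes the separate-filesystem layout pay off.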
hahaha.. I can just imagine it. I thought carrying home CDs of endless directory trees full of docs and emails was odd. I guess not!
Also, I have a collection of files that are very dear to me... files that I would not want to lose... financial records, source code directories, etc... these get tar'd up, gpg encrypted, and scp'd off to friends' computers via the net on a regular basis. This way, if Nevada were to fall off the map, I could still recover this data. i.e. I'm doing 'offsite' backups. Nothing like having your valuable files backed up to 3 different continents to make you feel secure.
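That tar/gpg/scp pipeline can be sketched like so. The directory names, recipient key, and remote host below are placeholders, not anything from the post; substitute your own, and note the recipient's public key must already be in your gpg keyring:

```shell
# Archive, encrypt, and ship off-site in one pass. Only the gpg-encrypted
# blob ever leaves the machine, so the friend hosting it can't read it.
tar -czf - "$HOME/finance" "$HOME/src" \
  | gpg --encrypt --recipient friend@example.com \
  > "precious-$(date +%Y%m%d).tar.gz.gpg"
scp precious-*.tar.gz.gpg friend.example.com:backups/
```

Recovery is the pipeline in reverse: `gpg --decrypt` the blob and feed it to `tar -xzf -`.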
I use the at scheduler in NT all the time. I think it's really similar, so that should be another easy transition. Great idea!
Oh yeah, I almost forgot.. learn cron as well.. nothing like having your backups done automatically in the middle of the night.
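A sample crontab entry for that, installed with `crontab -e` (the script path is hypothetical; point it at whatever backup script you settle on):

```shell
# crontab format: minute hour day-of-month month day-of-week command
# This runs the backup script at 2:30 AM every night:
30 2 * * * /usr/local/sbin/nightly-backup.sh
```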
Let's see, we have ruby, perl... Isn't there another gem stone I could learn? I like emeralds better... <LOL>
And finally... if you don't already know perl... it's time to learn it.. ( yes, I know, this is not linux specific ). There's just so much you can do with perl that it seems hard to administer a system without perl.
One of the guys I work with likes perl. I haven't taken a close look at it yet. Why, out of curiosity, would one choose perl over std. shell scripts, besides the fact that they hurt the brain trying to read them?
- jim
Great stuff! Thanks... thanks to everyone! -Gary
_______________________________________________ RLUG mailing list [EMAIL PROTECTED] http://www.rlug.org/mailman/listinfo/rlug
