Said michael stone on Tue, Nov 20, 2001 at 10:17:31PM -0600,

> We are talking about setting up a new linux lab and of course the question
> comes up: Which distro?
I maintain a Linux lab in ENS. It's a Debian lab, so I don't know how relevant this will be, but FWIW...

> Stability
> Security
> Ease of administrator stuff (we have a minimal staff)

I'd say that these have been the shining features of Debian potato. The lab machines basically never crash or malfunction. (More often it's me who takes them down to use a new kernel, etc.) Also, I can hook into the Debian security updates, so security-related patches get applied daily. Finally, the Debian package tool combines downloading and installing a package into one step, so I needn't bother with either by hand. That makes it easy to install a new package on many machines with a one-line shell script (rough sketch below). I've found this invaluable.

> Ease of user stuff
> Support
> Package availability
> A lack of annoying bugs

Our systems are Debian potato (stable) with kernel 2.4, reiserfs, Gnome 1.4, KDE 2.2, plus any other goodies students want. But the cost is that I've done a lot of customization to the stock Debian system, since potato was released a long time ago and the unstable branch is not going to be released any time soon. Generally this just means adding one more line to the package list and installing away. (Mostly, all the cool stuff from unstable gets backported to the stable distribution.) But it's definitely not a standard Debian distribution anymore. The system is pretty sweet, though, IMHO.

However, and it really breaks my heart to say this, I wouldn't suggest the Debian route at this time. When we implemented the lab, Debian stable was definitely the best system, but it's pretty aged right now, and it looks like the other distributions are catching up to its security. (We haven't had one break-in, which is much better than our Solaris stations. And once we installed RH 6.2 and it was compromised within 12 hours.) But now the distribution requires a lot of TLC to look and feel as hip as the others. The Debian team has a very slow release cycle. This is supposed to get better in the future, and then I would wholeheartedly suggest Debian, as the support and quality have been excellent.

Some random thoughts about a Linux lab:

Have as many browsers as you can. It seems that these days no one Linux browser can handle the entire web. Students still complain to me that Mozilla 0.9.5 crashes, and Opera and Konqueror have given me annoying problems with some pages, particularly anything involving Java, Flash, RealAudio, and so on. And Netscape 4 often crashes in a tight loop, eating all your CPU; but we still offer it because the UT registrar doesn't allow Mozilla. I had to write a script, run from cron, that searches for crashed Netscape processes and kills them (sketch below).

In our lab, I had to compete against NT for mindshare. I had to make the Linux stations very sexy to get lots of users, so my boss would feel that the investment was worth it. In my opinion, Linux (or some Unix) is absolutely essential in any academic environment. This may not be an issue for you.

If you're going to be administering, get very used to ssh and ssh-agent. They will save you a lot of time. Check out keychain, too.

Set up strict firewall rules and read the logs. This improves security, but it also helps to detect other network trouble. Once we discovered a router malfunction because an NT station was broadcasting NetBIOS to the Unix subnet. The first thing I did was write some Perl to print the log in human-readable format; that report gets mailed to me every night via cron.
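For what it's worth, the one-liner is nothing fancy. Roughly this, assuming each station lets me in as root by ssh key and the hostnames live in a file (the file name and package name are only placeholders):

    for h in `cat ~/lab-hosts`; do
        ssh root@$h 'apt-get update && apt-get -y install somepackage'
    done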
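And the "one more line to the package list" is just an extra apt source in /etc/apt/sources.list pointing at wherever the backported packages live, something like this (the host and path are made up):

    deb http://backports.example.org/debian potato main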
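The Netscape-killer is nothing clever either: a tiny script run from cron that looks for Netscape processes spinning the CPU and kills them. The idea, roughly (the ten-minute interval and 90% threshold are arbitrary):

    # /etc/crontab entry, every ten minutes:
    */10 * * * *  root  /usr/local/sbin/kill-hung-netscape

    # and /usr/local/sbin/kill-hung-netscape itself:
    #!/bin/sh
    # kill any netscape process stuck above 90% CPU
    ps -eo pid,pcpu,comm | awk '$3 ~ /netscape/ && $2 > 90 {print $1}' | \
        while read pid; do kill -9 $pid; done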
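The ssh-agent habit amounts to this (the key file name depends on which key type you generated; the hostname is a placeholder):

    # once per login session:
    eval `ssh-agent`
    ssh-add ~/.ssh/identity        # or id_dsa / id_rsa

    # after that, no more passphrase prompts:
    ssh root@lab01.example.edu

keychain just wraps the same thing so a single agent survives across logins; see its documentation for the couple of lines it wants in your shell startup files.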
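With a 2.4 kernel, firewall rules mean iptables. The spirit is default-deny with a log rule, so the nightly report has something to read. The ports, addresses, and log path below are examples only, and the report itself can be anything that greps the log and mails it (mine happens to be Perl):

    # default-deny inbound; allow loopback, replies, and ssh
    iptables -P INPUT DROP
    iptables -A INPUT -i lo -j ACCEPT
    iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
    iptables -A INPUT -p tcp --dport 22 -j ACCEPT
    # log whatever is about to fall through to the DROP policy
    iptables -A INPUT -j LOG --log-prefix "fw-drop: "

    # /etc/crontab: mail the day's hits every night
    # (kernel messages may land in /var/log/kern.log instead, depending on syslog.conf)
    55 23 * * *  root  grep 'fw-drop:' /var/log/syslog | mail -s 'firewall log' you@example.edu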
Speaking of logs, you should strongly consider a separate server just for keeping logs. Linux is a high-profile target, with many exploits available, and if a machine gets compromised, then you can't trust its logs. So make a dedicated server for logs and lock it down tight (rough sketch below).

If you want to install any custom software, you may wish to set up an NFS server and have the workstations mount it on /usr/local. Then you can compile and install once, and it's automatically on all your machines. Also, always compile software with a prefix ("./configure --prefix=/usr/local/software/fooprog-1.2.3"), and learn how to use stow. This will save you lots of time and headaches. (Sketches of both are below.)

Get at least two computers for yourself. Use one as a workstation and one as a lab prototype. Test all your changes on the prototype, and don't push them to the lab until you are certain that they work.

In our lab, I wrote some software that will netboot a machine, partition the disk, pull a software image over the network, install it, configure the IP address, hostname, etc., and reboot into a working station. That way I only had to set up one machine, and pushing it out to the lab took less than an hour. And all the machines are guaranteed to be identical.

Get familiar with Perl and shell scripts if you aren't already. A strong grasp of Perl has saved me countless hours of work.
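For the log host, plain syslog already knows how to do remote logging: one line in each workstation's /etc/syslog.conf forwards everything to the log server, and the server's syslogd runs with the -r flag so it accepts remote messages (the hostname is a placeholder):

    # on each workstation, in /etc/syslog.conf:
    *.*                             @loghost.example.edu

    # on the log server, run syslogd with remote reception enabled:
    syslogd -r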
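The NFS /usr/local arrangement is just an export on the server and a matching fstab entry on each workstation. The hostname and network below are placeholders; exporting it read-only to the lab is the sane default:

    # on the NFS server, /etc/exports:
    /usr/local      10.0.0.0/255.255.255.0(ro)

    # on each workstation, /etc/fstab:
    nfsserver.example.edu:/usr/local  /usr/local  nfs  ro,intr  0  0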
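And the prefix-plus-stow workflow looks like this (the package name is only an example):

    ./configure --prefix=/usr/local/software/fooprog-1.2.3
    make && make install

    # then symlink it into /usr/local so it shows up on the normal PATH:
    cd /usr/local/software
    stow -t /usr/local fooprog-1.2.3

    # to upgrade later, un-stow the old version and stow the new one:
    stow -t /usr/local -D fooprog-1.2.3

It keeps every package in its own directory, so removing or upgrading something never scatters stray files around /usr/local.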
Well, that's about all I can think of at the moment. Hope that helps!

P.S. Are you the guys in ME with the old AIX servers?

--
fingerprint: 740F B8D9 DF20 362C F5BC CDE6 6923 5A48 7657 541F
lynx -source http://www.ece.utexas.edu/~jhs/public_key.gpg | gpg --import