Actually, this could be a nice and productive discussion if the thread doesn't degenerate into an "I have more LTSP-based terminals than you do" contest. There are many issues affecting the deployment of LTSP-type solutions.
The most important of these is user acceptance. I know a lot of you abhor the corpulence of KDE and GNOME, but hey, sometimes it takes a desktop that resembles Windows in both functionality and beauty for a user to even consider moving over to an LTSP desktop. That means, for a barely acceptable system, you need either GNOME or KDE, Mozilla (or Firefox + Thunderbird), OpenOffice, and most probably GAIM to be minimally productive. I'm not even mentioning other stuff like Wine, Xine or MPlayer, Dia, or PDF readers... That takes a lot of resources. Hence my recommendation of 10 users for a 1 GB, 3 GHz server.

Secondly, there are certain problems associated with deploying LTSP solutions:

- For one reason or another, there are cases when the remote session disconnects from the server (power failure? network problems?). The terminal gets rebooted, but alas, the programs the user was running are still running on the server. Now the sysad has to kill each program one by one... There is no 'watchdog' program that takes care of this yet, at least not to my knowledge. GNOME is also notorious for leaving some programs behind even after the user has logged out. Some say this may be intentional; I say it's a bug.

- Network sound works but is not cleanly implemented. KDE/aRts does not support it and, as Tiger said before, locks the CPU into a loop that slows everybody else down. I know that's a KDE problem, NOT LTSP's, but you don't tell that to the person who hired you, right?

- In the course of running LTSP-based systems, one user, possibly not intentionally, can take up *a lot* of resources, both CPU and memory, thus slowing everyone else down.
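There is at least a partial existing answer to capping a runaway user: pam_limits, configured via /etc/security/limits.conf, which most PAM-based distros already hook into the login stack. The group name @students and the numbers below are purely illustrative assumptions, not recommendations:

```
# /etc/security/limits.conf -- excerpt read by pam_limits at login
# <domain>  <type>  <item>   <value>

# no more than 100 simultaneous processes per member of @students
# (blunts the repeated-double-click fork storm)
@students   hard    nproc    100

# cap per-process address space at 512 MB (value is in KB)
@students   hard    as       524288

# at most 30 minutes of CPU time per process
@students   hard    cpu      30
```

Note the granularity: nproc is per-user, while as and cpu are per-process, so a user can still spread memory use across many processes — but combined with nproc the damage is bounded. It doesn't throttle X traffic or CPU *share* (that would need nice levels or similar), but it stops the worst accidental DoS.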
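On the missing 'watchdog': a cron-driven sweep can approximate one. The sketch below is an assumption-laden illustration, not a tested tool — the MIN_UID cutoff and the idea of treating utmp (via `who`) as the source of truth for "active session" are my choices; it defaults to report-only so you can see what it would do first.

```shell
#!/bin/sh
# Rough sketch of a session watchdog: find processes owned by users who
# no longer have an active login session, and kill them.
# Defaults to report-only; set DRY_RUN=0 to actually kill.
MIN_UID=1000            # lowest UID treated as a real (human) user
DRY_RUN=${DRY_RUN:-1}

# users with a live session according to utmp
active=$(who | awk '{print $1}' | sort -u)

# every user that still owns at least one process
for user in $(ps -eo user= | sort -u); do
    uid=$(id -u "$user" 2>/dev/null) || continue
    [ "$uid" -ge "$MIN_UID" ] || continue          # skip system accounts
    if ! printf '%s\n' "$active" | grep -qx "$user"; then
        if [ "$DRY_RUN" = "1" ]; then
            echo "would kill leftover processes of $user"
        else
            pkill -u "$user"                       # polite TERM first...
            sleep 5
            pkill -KILL -u "$user"                 # ...then KILL stragglers
        fi
    fi
done
```

Run it from cron every few minutes. Caveat: it only trusts utmp, so anything that doesn't register a session there (detached screen sessions, some display managers) would look orphaned and get killed — hence the dry-run default.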
Sometimes, untrained users (or the truly impatient) make this worse by repeatedly clicking an icon, launching multiple instances of a program and bringing the entire network to a crawl. There should be a way to limit the amount of resources a user can consume, so that there is no possibility of an accidental (or intentional) DoS.

- Bandwidth: 100 Mbps at the very least. If you've got a lot of clients, the server should be on GbE attached to a GbE uplink, or on its own physical network, so it doesn't interfere with "normal" network traffic. I know the nature of thin clients means everything is on the wire, but in some cases the financial savings gained from not buying new PCs are offset by the logistical and physical problems of re-cabling/re-engineering the network backbone to support the minimum requirements.

Has anyone run into these problems before? What were your solutions/trade-offs, if you had any?

--
Philippine Linux Users' Group (PLUG) Mailing List
plug@lists.q-linux.com (#PLUG @ irc.free.net.ph)
Official Website: http://plug.linux.org.ph
Searchable Archives: http://marc.free.net.ph
. To leave, go to http://lists.q-linux.com/mailman/listinfo/plug
. Are you a Linux newbie? To join the newbie list, go to http://lists.q-linux.com/mailman/listinfo/ph-linux-newbie