Hi,

My apologies if members don't consider this list appropriate; if so, perhaps
a more appropriate forum could be suggested.

I am using Scientific Linux 6.2 and will be installing it on 4-5 computers
that will make up my lab. I plan to add some additional packages, from
source or otherwise:

/opt/python2.7
/opt/python3.2
/opt/libreoffice3.5
/opt/zotero3.0
/opt/qt-4.8.1

...and a few more, plus a number of Python 2.7 packages such as NumPy,
PyQt, etc. I have my environment set up pretty well on one machine with a
bash script I wrote to download, make, and install all the packages I
need. It also sets up a decent /etc/skel/, so that global settings apply
to all users.

I would like all the machines to be identical, and I would like to be able
to update packages automatically. My question is: what is the best way to
deploy and maintain this environment on several networked machines?
Setting up my own repository seems like a lot of overhead for a few
machines. Is there a better way? I could go around to each machine with my
bash script, but what would be the best way to handle updates? I am
willing to learn about repository management if necessary, but obviously
the less effort/knowledge required the better, as I am a relative novice
at system administration. Any suggestions will be appreciated.
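In case it helps to clarify what I mean by "go around to each machine",
here is a rough sketch of what I had in mind (the hostnames and the
setup-script name are made up; it only prints the commands rather than
running them, so nothing here touches a real machine):

```shell
#!/bin/sh
# Sketch only: HOSTS and SETUP are hypothetical placeholders.
HOSTS="lab1 lab2 lab3"
SETUP=lab-setup.sh   # the bash script that downloads, makes, and installs

# Print the commands that would copy the setup script to each machine and
# run it there over ssh. Echoing instead of executing makes this safe to
# inspect before committing to anything.
deploy_cmds() {
    for h in $HOSTS; do
        printf 'scp %s root@%s: && ssh root@%s sh ./%s\n' \
            "$SETUP" "$h" "$h" "$SETUP"
    done
}

deploy_cmds
```

Repeating this for every update is what I'd like to avoid, hence the
question about a less manual approach.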

Thanks, Chris
