On Tue, 20 Mar 2012, Miles Fidelman wrote:

[email protected] wrote:

Pretty much OK on 1-3; 4 and 5 are what I'm focusing on right now. Most of my build and config takes the form of fairly simple checklists, containing steps like:
- apt-get install <foo> (or: download; untar; ./configure; make; make test; make install)
- look up some config info (e.g., pull an unused IP address off a checklist)
- add a domain record to a zone file
- edit one or more config files
- /etc/init.d/<foo> start
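A checklist like that maps pretty directly onto a small bash function. This is only a sketch: the package name, IP, zone file path, and DNS record name below are placeholders, and the RUN prefix is a convenience so you can dry-run the whole thing with RUN=echo before pointing it at a real box.

```shell
#!/bin/bash
# Sketch of the checklist above as one provisioning function.
# All names (mywebapp, the IP, the zone path) are hypothetical.
set -e
RUN=${RUN:-}   # set RUN=echo to print commands instead of running them

provision() {
    local pkg=$1 ip=$2 zone=$3

    # apt-get install <foo>
    $RUN apt-get install -y "$pkg"

    # add a domain record to the zone file
    $RUN sh -c "echo 'app1 IN A $ip' >> $zone"

    # ... edit one or more config files here (sed, templates, etc.) ...

    # /etc/init.d/<foo> start
    $RUN /etc/init.d/"$pkg" start
}
```

Usage: `RUN=echo provision mywebapp 192.0.2.10 /etc/bind/db.example.com` echoes each step so you can eyeball it; drop RUN to execute for real.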

Other than editing config files, pretty much everything consists of one-line shell commands - easy enough to stick into a bash script. I guess a lot of the configuration could be done by adding sed commands to the script.
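For the config-file edits, a sed one-liner with a backup suffix is often all it takes. The file name and the Listen directive here are invented for the example (the script writes its own starting file so the edit is reproducible); in practice the package would ship the file.

```shell
#!/bin/bash
# Hypothetical config edit: app.conf and its contents are made up here
# so the example stands alone; normally the package installs the file.
CONF=app.conf
printf 'Listen 80\nUser www-data\n' > "$CONF"

# change the listen port in place, keeping the original as app.conf.bak
sed -i.bak 's/^Listen .*/Listen 8080/' "$CONF"
```

The `.bak` suffix gives you a cheap rollback and a diffable record of what the script changed.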

Thinking more and more that Rundeck (or something like it) would make it easy to manage and execute those scripts. The other thing that looks pretty interesting is a little project called sm-framework (https://sm.beginrescueend.com/presentations/SM-Framework-Presentation.pdf) - essentially it adds some glue to zsh and treats scripts like plug-ins.

One problem with just automating apt-get install <foo> is that you don't know which version of <foo> will be installed: it will be the latest one in whatever repository you are pointing at (by default, upstream). In development this is fine, but in production it can be a problem.

You can also get into a situation where you have tested with library foo.1, but library foo.2 is now out. You want foo.2 installed in dev/QA for testing, but if a production machine dies and needs to be re-imaged, you want to install version foo.1.

Or here's a problem I've seen: version foo.2 comes out and you test it in QA. Then maintenance night rolls around, you run apt-get upgrade in production, and it installs version foo.3, which was released 5 minutes ago and has a bug in it that kills your system.

To avoid these sorts of problems, you really don't want your production systems updating themselves from the upstream repositories. You need some process to download the packages, store them locally, and install just the version that you want.
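On Debian-style systems, one way to hold production at the tested version even if a newer one lands in the repository is an apt pin. The package name and version pattern below are illustrative, not from any real setup:

```
# /etc/apt/preferences.d/foo  (package name and version are placeholders)
Package: foo
Pin: version 1.2.*
Pin-Priority: 1001
```

A pin priority above 1000 makes apt prefer that version even if it means downgrading, so a re-imaged box comes back with what QA actually signed off on.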

This can be via some tool like puppet/chef/cfengine, or you can run your own repositories locally (one each for dev/qa/prod) with some sort of process to graduate packages from one repository to the next as testing completes.
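With per-environment repositories, each machine's sources.list points only at its own repo, so production can never see a package that hasn't graduated. The hostname and distribution names here are hypothetical:

```
# /etc/apt/sources.list on a production box (internal hostname invented)
deb http://aptrepo.internal/debian prod main
```

Promotion from qa to prod is then just copying the .deb into the next repository and regenerating its index (repository-management tools such as reprepro can script that step).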

David Lang
_______________________________________________
Tech mailing list
[email protected]
https://lists.lopsa.org/cgi-bin/mailman/listinfo/tech
This list provided by the League of Professional System Administrators
http://lopsa.org/
