Joe Rizzo wrote:
My current thinking is to have a limited portage tree that only contains
ebuilds needed by the systems. Included in this portage tree will be
ebuilds for custom software packages. The systems will sync off of this
custom maintained portage tree. I would like binary packages to be
available from a central repository and not have gcc on the systems. I
would like to avoid mounting a network file system. Am I on the best
path?
Functionally, systems will be imaged with the minimal base image via
systemimager. After that, packages will be deployed via emerge; the
packages will be precompiled and available. Going forward, packages will
be updated and deployed via emerge.
Please provide experience or ideas on:
1) Creating and maintaining a minimal gentoo image?
I've done it in quite a simple way:
1. Make an install that looks the way you want the base server to look.
2. Tar it up into a tarball called stage4-<arch>.2005.0.tar.bz2.
3. Upload to internal web/ftp.
4. Install next servers with stage4 :)
Of course this isn't very automated, but it makes for a quicker install
than anything we've used previously.
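Steps 2 and 3 above, roughly as I'd run them on the configured base
server (the exclude list and filename are just my choices, adjust to
taste):

```shell
# Roll the running base install into a stage4 tarball.
# /proc, /sys and the tmp dirs are excluded since they are
# recreated at boot / hold only volatile data.
cd /
tar cjpf /tmp/stage4-x86.2005.0.tar.bz2 \
    --exclude=./proc --exclude=./sys \
    --exclude=./tmp --exclude=./var/tmp .

# Step 3: push the tarball to the internal web/ftp box, e.g.:
# scp /tmp/stage4-x86.2005.0.tar.bz2 install-server:/var/www/stages/
```

On the new box you unpack it from the install CD the same way you would
an official stage3, then fix up hostname, keys and network config.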
2) Managing portage and packages for a large scale gentoo environment?
I administer a smaller but rapidly growing server farm of about a dozen
Gentoo boxes. I distribute custom ebuilds with the gensync tool from
gentoolkit-dev and a portage overlay; that way all my own ebuilds reside
in a separate place. I've been planning to create a second portage
overlay for "stable" packages, and then only let "emerge --sync" update
important stuff like php, apache, mysql and exim, though I'm not 100%
convinced of the longevity of such a setup. I've found that cleaning up
broken binary packages and revdep-rebuild fallout is currently quite a
headache, so something along those lines needs to be done.
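For reference, the relevant bits of such a setup look roughly like this
(paths and the binhost URL are made up for illustration; the
FEATURES="buildpkg" / PORTAGE_BINHOST pair is what gets you central
binary packages without gcc on the clients, as Joe asked):

```shell
# /etc/make.conf on the build box (a sketch, not my exact config):
PORTDIR_OVERLAY="/usr/local/portage"   # custom ebuilds live here
FEATURES="buildpkg"                    # save a binary package of everything emerged
PKGDIR="/usr/portage/packages"         # served over http to the clients

# /etc/make.conf on the clients:
PORTDIR_OVERLAY="/usr/local/portage"
PORTAGE_BINHOST="http://build.example.com/packages/All"
```

The clients then install with "emerge -gK <package>" (fetch the binary
package from the binhost, never compile locally), so no toolchain is
needed on them.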
/Daniel
--
[email protected] mailing list