On Mon, 29 Jan 2007 11:50:34 +0200, Alan McKinnon wrote:

> I already use a fairly complicated solution with emerge -pvf and wget
> in a cron job on one of the fileservers, but it's getting cumbersome.
> And I'd rather not maintain an entire Gentoo install on a server simply
> to act as a proxy. Would I be right in saying that I'd have to keep
> the "proxy" machine up to date to avoid the inevitable blockers that
> will happen in short order if I don't?
> 
> I've been looking into kashani's suggestion of http-replicator, this 
> might be a good interim solution till I can come up with something 
> better suited to our needs.
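
For what it's worth, that emerge -pvf/wget combination presumably looks
something like this - the world target, the grep pattern and the
distfiles path are guesses on my part:

  # fetch-only pretend run lists the download URIs; pull out the URLs
  # and feed them to wget, resuming into the shared distfiles directory
  emerge -pvf world 2>&1 | grep -o 'http://[^ ]*' | \
      wget -c -P /usr/portage/distfiles -i -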

I was suggesting the emerge -uDNf world in combination with
http-replicator. The first request forces http-replicator to download
the files; all other requests for those files are then handled locally.
So if you run this on a suitable cross-section of machines overnight,
http-replicator's cache will be primed by the time you stumble
bleary-eyed into the office.
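
A minimal sketch of that setup, assuming http-replicator is listening
on port 8080 on a box called proxybox (both the hostname and the port
are assumptions, adjust to taste):

  # /etc/make.conf on every client - portage's wget fetches honour
  # these, so each download goes through http-replicator's cache
  http_proxy="http://proxybox:8080"
  ftp_proxy="http://proxybox:8080"

  # root crontab on the overnight machines: deep fetch-only update,
  # nothing gets installed, the distfiles just prime the cache
  0 3 * * *  emerge -uDNf world >/dev/null 2>&1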

If all your machines run a similar mix of software, say KDE desktops, you
only need to run the cron task on one of them.

I use a slightly different approach here, with an NFS-mounted $DISTDIR
for all machines and one of them doing emerge -f world each morning.
It's simpler to set up than http-replicator but less scalable, since
you'll get problems if one machine tries to download a file while
another is partway through downloading it.
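
A rough sketch of that arrangement, assuming the server exports
/usr/portage/distfiles and is reachable as fileserver (names, paths and
the subnet are assumptions):

  # /etc/exports on the file server
  /usr/portage/distfiles  192.168.1.0/24(rw,sync,no_subtree_check)

  # /etc/fstab on each client - every box shares the same distfiles
  fileserver:/usr/portage/distfiles  /usr/portage/distfiles  nfs  rw  0 0

  # /etc/make.conf on each client (this is the default value anyway)
  DISTDIR="/usr/portage/distfiles"

  # root crontab on the one designated fetcher: download, don't install
  0 5 * * *  emerge -f world >/dev/null 2>&1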


-- 
Neil Bothwick

Most software is about as user-friendly as a cornered rat!
