On 23 Sep 2003, at 7:18 pm, Michael Schreckenbauer wrote:
The `emerge --update` is done separately - no-one sensible would advise
system updates without manual intervention. However, if all the files
are stored on one machine & exported over NFS, then it is expedient to
have that machine do the fetching of all files. A cron job to `emerge
sync && emerge -fud world` does NO installation or upgrading of any
systems - it only updates the local portage database & fetches the
updates you require. It would, I would think, be quite easy to unshare
the NFS export before getting the files, and reshare it afterwards, as
part of the cron job. If this is done at 4am, then it is likely to
cause little interruption to service in most environments.
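As a sketch of what that cron job might look like - the export path `/usr/portage/distfiles`, the wildcard client list and the use of `exportfs` are all assumptions for illustration, so adjust to taste:

```shell
#!/bin/sh
# /usr/local/sbin/nightly-fetch.sh (hypothetical name) - run from cron, e.g.:
#   0 4 * * *   /usr/local/sbin/nightly-fetch.sh
# Unshare the distfiles export, fetch, then share it again.

exportfs -u '*:/usr/portage/distfiles'        # stop sharing during the fetch
/usr/bin/emerge sync && /usr/bin/emerge -fud world
exportfs '*:/usr/portage/distfiles'           # re-share afterwards
```

This only fetches (`-f`); no installation happens, so the worst a client sees is the share disappearing briefly at 4am.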
While I share your opinion that shell-scripting is an adequate way of
cleaning distfiles, I must admit that a cron job which calls `emerge sync &&
emerge -fud world` only makes sense on a shared .../distfiles if, and only if,
all the machines sharing it have the same packages installed. For example, I have a
server without X and suchlike, and some desktop machines which naturally do have these things.
The share resides on the server, because that is the only machine up 24/7.
How could an update of X, KDE etc. happen and use the already-downloaded files?
If the desktops run `emerge -ud world` in parallel, wouldn't that cause
problems? Am I missing something? I'd love to do it this way, but I have found no
workaround for this problem yet.
You are right - in this scenario, if two desktops run `emerge -u world` simultaneously against the shared directory, then they will collide on fetches of new files which are not in the server's world file. So in this instance something more sophisticated is required.
My personal approach would be, then, to hold copies of the "world" file for each desktop machine on the master server - something like /var/cache/edb/world, then /var/cache/edb/world.machine2, /var/cache/edb/world.machine3 & so on. A cron job on each desktop backs these up at a suitable time each day, before the desktop users leave for the evening & shut their machines down (you could even add it to the shutdown sequence, I guess).
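On each desktop that backup could be a single cron entry - a sketch only, assuming the server is reachable as `master` and that passwordless ssh keys are already set up (both assumptions, as is the 17:30 weekday schedule):

```shell
# /etc/crontab fragment on each desktop (illustrative):
# copy this machine's world file to the master server before home-time.
30 17 * * 1-5   root    scp /var/cache/edb/world master:/var/cache/edb/world.$(hostname)
```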
Now, I'm no Bash guru, so I have to recommend the Advanced Bash Scripting Guide, which is a free download from the LDP; chapter 10 covers lists, and examples 10-3 & 10-4 suggest that one could then proceed something like:
#!/bin/bash

/usr/bin/emerge sync
for application in `cat /var/cache/edb/world*`
do
    /usr/bin/emerge -uf $application
done
I guess there might be a bug in the way this script handles the backquotes and the substitution of * in the for-loop, but I'm sure one could fix this - hopefully you get the point. One could also concatenate all the files into a pipe and uniquely `sort` them, so that `emerge -f` isn't called repeatedly for packages which have already been downloaded. But these are details. I think this would, loosely speaking, work in such a way that the master server downloads all the files that will later be emerged on the desktops.
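To illustrate the de-duplication idea, here is a small self-contained demo - the two world files and their contents are made up for the example; the real ones would live in /var/cache/edb/:

```shell
#!/bin/sh
# Combine several per-machine world files and strip duplicates with
# `sort -u`, so each package atom is fetched only once.
dir=$(mktemp -d)
printf 'kde-base/kdebase\nx11-base/xorg-x11\n' > "$dir/world.machine2"
printf 'kde-base/kdebase\napp-editors/vim\n'   > "$dir/world.machine3"

# kde-base/kdebase appears in both files but is listed only once:
cat "$dir"/world* | sort -u

rm -r "$dir"
```

The loop in the script above would then iterate over `cat /var/cache/edb/world* | sort -u` instead of the raw concatenation.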
Hope this makes sense,
Stroller.
-- [EMAIL PROTECTED] mailing list
