On Friday 19 January 2007 18:18, Jens Kubieziel wrote:
> Hi,
>
> a friend of mine wants to use Gentoo, but has a poor internet
> connection. We are thinking about a convenient way to get packages.
> We thought about redefining $FETCHCOMMAND to something like
> 'FETCHCOMMAND="echo ${URI} > package.file'. But that (and also other
> tries) did not work. What is the best way to get a file of
> download-URLs to feed to wget?
>
> Thanks for any recommendations

FETCHCOMMAND is what portage uses to fetch distfiles. Once it has run, 
portage expects the file to be sitting in DISTDIR. What you have done is 
replace the actual download with an echo, so the file never arrives and 
of course it won't work.

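For reference, the stock definition in make.globals is something along 
these lines (quoting from memory, the exact wget flags vary between 
portage versions):

# roughly the default; portage fills in ${URI} and ${DISTDIR}, and
# expects the distfile to be in ${DISTDIR} once the command returns
FETCHCOMMAND="/usr/bin/wget -t 5 --passive-ftp \${URI} -P \${DISTDIR}"

With the echo nothing ever lands in ${DISTDIR}, which is why portage 
then falls over.
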
I normally run 'emerge -pvf' to get a list of URIs to download, then 
bash it into shape to get a text file listing, and send that to wget -i

Something like:

# dump the pretend/fetch output, ebuild lines, URIs and all
emerge -pvf world > emerge.lst
# keep only the first field of each line (i.e. the first URI for each
# file) and drop duplicates
cut -f1 -d' ' emerge.lst | sort -u > emerge.1.lst
# inspect emerge.1.lst by hand and strip out the non-URI cruft
wget -i emerge.1.lst
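
One thing to watch: run the wget in (or move the results into) your 
DISTDIR, normally /usr/portage/distfiles, so portage finds the files and 
skips the fetch when you do the real emerge. Something like:

cd /usr/portage/distfiles
wget -nc -i /path/to/emerge.1.lst   # -nc skips anything already present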

This can of course be improved tremendously. It only tries the first URI 
for any given file (because of the cut), and it always attempts to 
download every file for every package to be merged (as I haven't found 
an easy way to get just a list of stuff not in distfiles)
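
For the second point, here is an untested sketch that filters the list 
against distfiles before handing it to wget (assuming the default 
DISTDIR of /usr/portage/distfiles and the emerge.1.lst from above):

# drop anything already sitting in distfiles
while read uri; do
    [ -e "/usr/portage/distfiles/$(basename "$uri")" ] || echo "$uri"
done < emerge.1.lst > emerge.todo.lst
# -c resumes partial downloads, handy on a flaky link
wget -c -i emerge.todo.lst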

I'm sure you could do a better job with a bit of work; this just happens 
to suit my particular needs.

alan

-- 
gentoo-user@gentoo.org mailing list
