Grant wrote:

Each line in your links.txt is a list of different mirror URLs for the
same package, separated by '%20'.
This should take the first link from every line and pass it to wget:
cat links.txt | sed -e 's/%20.*//' | xargs -n 1 wget
or alternatively:
sed -e 's/%20.*//' <links.txt >links1.txt
wget -i links1.txt
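For instance, on a throwaway sample file (made-up mirror URLs, and assuming the separators really are literal '%20' sequences), the sed step keeps only the first URL on each line; echo stands in for wget here:

```shell
# Throwaway sample line; the URLs are hypothetical
printf 'http://m1.example/pkg.tar.gz%%20http://m2.example/pkg.tar.gz\n' > sample.txt
# Strip everything from the first %20 onward, then hand each result to echo
sed -e 's/%20.*//' sample.txt | xargs -n 1 echo
# prints http://m1.example/pkg.tar.gz
```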


Thank you, that seems to be working great.  Should I update the wiki?
It says I should do this on the networkless machine:

emerge -fp package1 package2 2> links.txt

and this on the networked machine:

wget -i links.txt

and that's what I did.

- Grant

Actually, that first line of code ends up trying to download the same file over and over. I think there are several different paths specified for each file so it can always find one that works, and the command is trying to download every one of them.

The alternate solution ends up in the same situation as before, with a
bunch of bad paths.

- Grant
-- mailing list

Ok, I actually went and tested this out this time. The %20s come from wget escaping spaces; they are not actually in the file, so you could do:
cat links.txt | sort | uniq | sed -e 's/ .*//' | xargs -n 1 wget -c

The sort | uniq stage removes identical lines, and wget's -c option means a file that has already been fully downloaded is skipped instead of being saved again under a new name.
It's a hack, but it should work. It relies on the first mirror for each package being valid; if it isn't, that package won't be downloaded, since sed prunes all but the first link. So if one of the packages doesn't download, you'll have to take one of its other links and fetch it manually.
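As a quick sanity check (made-up URLs again, with echo standing in for wget -c), the dedup-plus-sed pipeline behaves like this:

```shell
# Hypothetical sample: a duplicated line plus a second package
printf '%s\n' \
  'http://m1.example/a.tar.gz http://m2.example/a.tar.gz' \
  'http://m1.example/a.tar.gz http://m2.example/a.tar.gz' \
  'http://m1.example/b.tar.gz' > sample.txt
# Duplicates collapse to one line and only the first mirror survives
cat sample.txt | sort | uniq | sed -e 's/ .*//' | xargs -n 1 echo
# prints:
#   http://m1.example/a.tar.gz
#   http://m1.example/b.tar.gz
```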
Alternatively you could try:

cat links.txt | sort | uniq | xargs -I{} sh -c 'for i in {}; do wget -c "$i" && break; done'

This command will actually iterate over the alternative links for each file until one of them works. It's not perfect, but it works reasonably well for me. Use this one, and if it breaks for you, fall back to the first, less complicated line.
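The same retry logic is perhaps easier to read as a small shell function (a sketch; `sort -u` is just shorthand for the sort | uniq above):

```shell
# For each deduplicated line of links.txt, try each mirror in turn
# and stop at the first one wget fetches successfully.
fetch_first_working() {
    sort -u links.txt | while read -r line; do
        for url in $line; do       # intentional word split on spaces
            wget -c "$url" && break
        done
    done
}
```

Then just run `fetch_first_working` in the directory containing links.txt.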


PS: I don't see how the line in the wiki could have worked well unless emerge -fp used to behave differently.

