Re: [gentoo-user] making a file-list at (a) for fetching at (b)

2009-11-14 Thread Alan McKinnon
On Saturday 14 November 2009 20:55:16 Maxim Wexler wrote:
> > redirect to a file, bash it into suitable shape with your Unix text tools
> > of course, use said file as input to wget.
> >
> >
> > --
> > alan dot mckinnon at gmail dot com
> 
> Here
> 
> http://www.gentoo-wiki.info/TIP_Gentoo_for_dialup_users
> 
> I found this gem:
> 
> emerge -fpu world | sort | uniq | sed '/\(^http\|^ftp\).*/!d;s/\ .*$//g' > links.txt
> 
> But something doesn't seem right. links.txt has 92 lines (I added the
> N and D switches) that all use only one URL, distfiles.gentoo.org, for each
> package. It's 5.5k. But the raw command lists several URLs for each
> package and it's gotta be ~200k. And if you read the article the wget
> command is meant to skip the other URLs as soon as one instance of the
> pkg has been downloaded:
> 
> "With wget, just do:
> 
> wget -i links.txt -nc
> 
> Option -i tells wget to look inside links.txt for URLs of stuff to
> download, option -nc tells it not to download it twice or thrice once
> the file has been retrieved from a working URL."
> 
> Am I missing something here?

The output of emerge -f lists ALL known mirrors and SRCs configured on the 
machine for each distfile. So you really only need to grab the first one 
listed, which is usually gentoo.org. If the file is not there, it is most 
unlikely to be anywhere else, with the exception of fetch-restricted packages. 
Those you would have to download manually anyway.

When I was forced to use this same method, I just used cut on the output, 
using space as the delimiter. This seldom failed.
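
A rough sketch of that approach (the exact filter and file names here are
illustrative, not the command from the original post):

  # keep only the URI lines, then keep just the first space-delimited URL on each
  emerge -fpu world | grep -E '^(http|ftp)' | cut -d' ' -f1 | sort -u > links.txt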

I would not trust wget -nc. It has all kinds of implications, including how it 
handles differing timestamps. wget -c is better: it completes partial 
downloads.
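
For example, a minimal sketch (the target directory is just a placeholder):

  # -c resumes partial downloads; -P puts everything in one directory
  wget -c -i links.txt -P distfiles/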

-- 
alan dot mckinnon at gmail dot com



Re: [gentoo-user] making a file-list at (a) for fetching at (b)

2009-11-14 Thread Maxim Wexler
> redirect to a file, bash it into suitable shape with your Unix text tools of
> course, use said file as input to wget.
>
>
> --
> alan dot mckinnon at gmail dot com
>
>

Here

http://www.gentoo-wiki.info/TIP_Gentoo_for_dialup_users

I found this gem:

emerge -fpu world | sort | uniq | sed '/\(^http\|^ftp\).*/!d;s/\ .*$//g' > links.txt

But something doesn't seem right. links.txt has 92 lines (I added the
N and D switches) that all use only one URL, distfiles.gentoo.org, for each
package. It's 5.5k. But the raw command lists several URLs for each
package and it's gotta be ~200k. And if you read the article the wget
command is meant to skip the other URLs as soon as one instance of the
pkg has been downloaded:

"With wget, just do:

wget -i links.txt -nc

Option -i tells wget to look inside links.txt for URLs of stuff to
download, option -nc tells it not to download it twice or thrice once
the file has been retrieved from a working URL."

Am I missing something here?



Re: [gentoo-user] making a file-list at (a) for fetching at (b)

2009-11-13 Thread Alan McKinnon
On Friday 13 November 2009 22:24:34 Maxim Wexler wrote:
> Hi group,
> 
> Can someone explain to me how to generate a list of files to be
> fetched, e.g. -fuDN world, on a slow desktop that can be downloaded onto
> a netbook later?


emerge -pf  

redirect to a file, bash it into suitable shape with your Unix text tools of 
course, use said file as input to wget.
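
One possible shape for that pipeline, as a rough sketch (file names are
placeholders, and the grep/cut step is just one way to do the shaping):

  emerge -pf world > raw.txt                                 # on machine (a)
  grep -E '^(http|ftp)' raw.txt | cut -d' ' -f1 > links.txt
  wget -c -i links.txt                                       # later, on machine (b)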


-- 
alan dot mckinnon at gmail dot com



Re: [gentoo-user] making a file-list at (a) for fetching at (b)

2009-11-13 Thread Jesús Guerrero
On Fri, 13 Nov 2009 15:36:27 -0500, Marcus Wanner wrote:
> On 11/13/2009 3:24 PM, Maxim Wexler wrote:
>> Hi group,
>>
>> Can someone explain to me how to generate a list of files to be
>> fetched, e.g. -fuDN world, on a slow desktop that can be downloaded onto
>> a netbook later?
>>
>> Maxim
>>   
> You want to generate a list of packages to be upgraded, and then upgrade
> them on a different computer?

He probably wants to update a computer that has no internet connection.
Hence, he will need to get a list of the URIs to download on (a), then
fetch them from another computer (b), then put all the files into the
$DISTDIR of the (a) computer. Emerge will pick them up from there instead
of trying to download them.
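
A sketch of that last step, assuming the default DISTDIR of
/usr/portage/distfiles (emerge --info on (a) will show the actual value):

  # on (b): fetch everything in the list
  wget -c -i links.txt -P fetched/
  # back on (a): drop the files into DISTDIR so emerge finds them there
  cp fetched/* /usr/portage/distfiles/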

-- 
Jesús Guerrero



Re: [gentoo-user] making a file-list at (a) for fetching at (b)

2009-11-13 Thread Jesús Guerrero
On Fri, 13 Nov 2009 13:24:34 -0700, Maxim Wexler wrote:
> Hi group,
> 
> Can someone explain to me how to generate a list of files to be
> fetched, e.g. -fuDN world, on a slow desktop that can be downloaded onto
> a netbook later?

Add -p or --pretend, and redirect the output to a file.

  emerge -fuDNp world > list.txt
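
The pretend output still mixes status lines with the URIs, so a follow-up
filter along these lines (a sketch, not part of the original reply) trims it
down to something wget can take with -i:

  grep -E '^(http|ftp)' list.txt | cut -d' ' -f1 > links.txt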
-- 
Jesús Guerrero



Re: [gentoo-user] making a file-list at (a) for fetching at (b)

2009-11-13 Thread Marcus Wanner

On 11/13/2009 3:24 PM, Maxim Wexler wrote:

Hi group,

Can someone explain to me how to generate a list of files to be
fetched, e.g. -fuDN world, on a slow desktop that can be downloaded onto
a netbook later?

Maxim
  
You want to generate a list of packages to be upgraded, and then upgrade 
them on a different computer?


Marcus



[gentoo-user] making a file-list at (a) for fetching at (b)

2009-11-13 Thread Maxim Wexler
Hi group,

Can someone explain to me how to generate a list of files to be
fetched, e.g. -fuDN world, on a slow desktop that can be downloaded onto
a netbook later?

Maxim