Re: apt-torrent (WAS: Re: apt PARALLELISM)

2006-01-09 Thread Nathanael Nerode
 It'll take me some time to find a new, and more appropriate home for
 apt-torrent.

The Debian archive (experimental distribution) would be a *very*
appropriate home.

It won't provide a testbed package seeder or a place to download .torrent files,
but that can be done later (and by any number of different people, actually).


-- 
To UNSUBSCRIBE, email to [EMAIL PROTECTED]
with a subject of unsubscribe. Trouble? Contact [EMAIL PROTECTED]



Re: apt-torrent (WAS: Re: apt PARALLELISM)

2006-01-09 Thread Adam Heath
On Mon, 9 Jan 2006, Arnaud Kyheng wrote:


 Hello all and Happy New Year,


 Thanks to George, apt-torrent has been mentioned in the Debian Devel
 list :o)

 I've just noticed it, and the fun part of this discovery is that I also
 found out why my ISP has shut down sianka.free.fr: too many hits since the
 latest Debian Weekly News and the new apt-torrent 0.3.1-1 package!

 I apologize, but, a victim of its own success, the apt-torrent homepage is
 down, and so is its repository.

 It'll take me some time to find a new, and more appropriate home for
 apt-torrent.

What stats are needed?  Brainfood is offering.





apt-torrent (WAS: Re: apt PARALLELISM)

2006-01-08 Thread Arnaud Kyheng

Hello all and Happy New Year,


Thanks to George, apt-torrent has been mentioned in the Debian Devel
list :o)

I've just noticed it, and the fun part of this discovery is that I also
found out why my ISP has shut down sianka.free.fr: too many hits since the
latest Debian Weekly News and the new apt-torrent 0.3.1-1 package!

I apologize, but, a victim of its own success, the apt-torrent homepage is
down, and so is its repository.

It'll take me some time to find a new, and more appropriate home for
apt-torrent.


Arnaud


George Danchev wrote:
apt-torrent seems to approach that too:
http://sianka.free.fr/documentation.html






Re: apt-torrent (WAS: Re: apt PARALLELISM)

2006-01-08 Thread Christian Perrier
 I've just noticed it, and the fun part of this discovery is that I also
 found out why my ISP has shut down sianka.free.fr: too many hits since the
 latest Debian Weekly News and the new apt-torrent 0.3.1-1 package!


The solution is simple: get it into the Debian archive. :)






Re: apt PARALLELISM

2006-01-06 Thread Michelle Konzack
On 2005-12-27 16:25:10, Bernd Eckenfels wrote:
 In article [EMAIL PROTECTED] you wrote:

 apt-get update && apt-get --download-only upgrade
 
 It would make more sense to send out the diffs to the Packages.gz, so you
 don't actually need to download the Packages file every five minutes.

But if the Packages.gz has not changed,
'apt-get update' does not download it.

OK, diffs would be better.

 Regards,
 Bernd

Greetings
Michelle

-- 
Linux-User #280138 with the Linux Counter, http://counter.li.org/
# Debian GNU/Linux Consultant #
Michelle Konzack   Apt. 917  ICQ #328449886
   50, rue de Soultz MSM LinuxMichi
0033/3/8845235667100 Strasbourg/France   IRC #Debian (irc.icq.com)





Re: apt PARALLELISM

2005-12-27 Thread Michelle Konzack
On 2005-12-21 16:32:20, Goswin von Brederlow wrote:

 I have 10240 kBit downstream and get way less from security.debian.org.
 Especially when there is a security release of X or LaTeX.

There are two possibilities:

Subscribe to [EMAIL PROTECTED] with a separate e-mail address and track
it.  Check the e-mail at five-minute intervals.  Write a script (we do
not want to download Packages.gz if there is no package of interest)
which checks whether the new package is installed on one of your
systems or not.  If it is installed, immediately start an

apt-get update && apt-get --download-only upgrade

How many people use such a system?

You will get pure speed, and you will be one of the first to get the
package(s).

 Regards,
 Goswin

Greetings
Michelle






Re: apt PARALLELISM

2005-12-27 Thread Bernd Eckenfels
In article [EMAIL PROTECTED] you wrote:
 Subscribe to [EMAIL PROTECTED] with a separate e-mail address and track
 it.  Check the e-mail at five-minute intervals.  Write a script (we do
 not want to download Packages.gz if there is no package of interest)
 which checks whether the new package is installed on one of your
 systems or not.  If it is installed, immediately start an

apt-get update && apt-get --download-only upgrade

It would make more sense to send out the diffs to the Packages.gz, so you
don't actually need to download the Packages file every five minutes.

Regards,
Bernd





Re: apt PARALLELISM

2005-12-23 Thread Goswin von Brederlow
Olaf van der Spek [EMAIL PROTECTED] writes:

 On 12/21/05, Goswin von Brederlow [EMAIL PROTECTED] wrote:
  Who needs PARALLELISM, and who has a bandwidth of more than 8 MBit?

 I have 10240 kBit downstream and get way less from security.debian.org.
 Especially when there is a security release of X or LaTeX.

 But parallel downloads won't solve that.

Security has 3 servers; each gives me a few K/s, and they don't get slower
if I use all three. In the best case they really do add up to 3 times the
speed.

At a minimum, apt could use the fastest of the 3.

Regards,
Goswin





Re: apt PARALLELISM

2005-12-21 Thread Goswin von Brederlow
Michelle Konzack [EMAIL PROTECTED] writes:

 On 2005-12-12 13:23:01, Goswin von Brederlow wrote:

 Actualy one thing apt could do:
 
 [EMAIL PROTECTED]:~% host security.debian.org
 security.debian.org A   82.94.249.158
 security.debian.org A   128.101.80.133
 security.debian.org A   194.109.137.218
 
 Why not open 3 connections one to each host?

 I have tested the mirrors, and there is a problem with my 8 MBit ADSL:
 single download or parallel, I get the same speed all the time.

 7616 kBit or 780 kByte/s

 Who needs PARALLELISM, and who has a bandwidth of more than 8 MBit?

I have 10240 kBit downstream and get way less from security.debian.org.
Especially when there is a security release of X or LaTeX.

 I have written a small apt-get wrapper which uses --print-uris; I piped
 its output to the same number of wget processes, backgrounded and
 disowned... (it downloads all the files into /var/cache/apt/archives,
 and after this I install from that directory).

 Same result. No performance gain.

 No chance at all!  -  The Debian servers are faster.

If there is a slowdown, it usually is somewhere between Debian and the
user. Connection speeds to various servers can vary widely.

 Greetings
 Michelle

Regards,
Goswin





Re: apt PARALLELISM

2005-12-21 Thread Goswin von Brederlow
Michelle Konzack [EMAIL PROTECTED] writes:

 On 2005-12-06 09:53:43, Ivan Adams wrote:
 Hi again,
 in my case:
 I have a slow internet connection, BUT I have friends with the same
 connection in my local area network who have apt-proxy.
 My goal is: when I need to install a new system (Debian) for a new user,
 or dist-upgrade an entire system, I need the unstable packages from the
 site.  In this case I need to wait some HOURS.  If apt had *PARALLELISM*,
 I could use my connection and at the same time the connections of the
 apt-proxy.  In that case I would download the packages twice (or more)
 as fast.

 ???

 Do you have two dial-in lines?
 1)  You => Internet => Debian-Server
 2)  You => your friends' apt-proxy

More like ppp0 and eth0 I guess.


You should think about using cron-apt to download Debian updates
during the night while you sleep (download, not install). That way the
apt-proxy will always have all your packages ready when you decide to
actually install.
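
The nightly download schedule suggested above could be a cron entry along
these lines (a sketch only; the path and schedule are assumptions, so check
your own cron-apt installation):

```shell
# Hypothetical /etc/cron.d entry: let cron-apt fetch updated packages at
# 04:00 every night.  cron-apt defaults to download-only, so nothing is
# installed while you sleep.
0 4 * * *   root    test -x /usr/sbin/cron-apt && /usr/sbin/cron-apt
```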

You might also switch to squid and set up your friends' squid as a peer
so they share the downloads they have.

Regards,
Goswin





Re: apt PARALLELISM

2005-12-21 Thread Olaf van der Spek
On 12/21/05, Goswin von Brederlow [EMAIL PROTECTED] wrote:
  Who needs PARALLELISM, and who has a bandwidth of more than 8 MBit?

 I have 10240 kBit downstream and get way less from security.debian.org.
 Especially when there is a security release of X or LaTeX.

But parallel downloads won't solve that.


Re: apt PARALLELISM

2005-12-21 Thread Henrique de Moraes Holschuh
On Wed, 21 Dec 2005, Olaf van der Spek wrote:
 On 12/21/05, Goswin von Brederlow [EMAIL PROTECTED] wrote:
   Who needs PARALLELISM, and who has a bandwidth of more than 8 MBit?
 
  I have 10240 kBit downstream and get way less from security.debian.org.
  Especially when there is a security release of X or LaTeX.
 
 But parallel downloads won't solve that.

Yes, sometimes they do.  Those of us who experience it can tell you that,
and in fact, we did.

The question is whether it is *desired* to try to fix that with parallel
downloads.

-- 
  One disk to rule them all, One disk to find them. One disk to bring
  them all and in the darkness grind them. In the Land of Redmond
  where the shadows lie. -- The Silicon Valley Tarot
  Henrique Holschuh





Re: apt PARALLELISM

2005-12-20 Thread Michelle Konzack
On 2005-12-05 16:11:35, Joe Smith wrote:

 This person is requesting parallel downloads from multiple servers. So
 basically, during package download, if there are three full and up-to-date
 mirrors in sources.list, there should be simultaneous downloads of different
 packages from all three different mirrors.

For what?

I am using ADSL with 8 MBit (= 830 KByte/second) and have
never had problems downloading one package after another.

Since last week I have an E3 (34 MBit) in Paris, and now
I can download like hell.

Downloading from multiple servers in parallel gains nothing.

 The concept is that in some cases this can noticeably improve performance,
 especially with sites that throttle bandwidth, or have some other sort of
 bottleneck.

Some other bottleneck?  Allowing 1000 client connections on an E1?

 I would say this is a feature request, rather than a bug report of any 
 kind. 

:-)

Greetings
Michelle






Re: apt PARALLELISM

2005-12-20 Thread Michelle Konzack
On 2005-12-06 09:53:43, Ivan Adams wrote:
 Hi again,
 in my case:
 I have a slow internet connection, BUT I have friends with the same
 connection in my local area network who have apt-proxy.
 My goal is: when I need to install a new system (Debian) for a new user,
 or dist-upgrade an entire system, I need the unstable packages from the
 site.  In this case I need to wait some HOURS.  If apt had *PARALLELISM*,
 I could use my connection and at the same time the connections of the
 apt-proxy.  In that case I would download the packages twice (or more)
 as fast.

???

Do you have two dial-in lines?
1)  You => Internet => Debian-Server
2)  You => your friends' apt-proxy

???

If it goes over the same connection, there will be no benefit.
I have tried around 40 Debian mirrors, and none of them gives me
less than 8 MBit, which is the speed of my ADSL in France.

You can only have an advantage if YOU run the apt-proxy for
your own network and you install more than one machine.

On the other hand, you can mirror a Debian site (which I do,
because I am installing/upgrading 5-20 machines per week).

 Thank you!
 Best regards

Greetings
Michelle






Re: apt PARALLELISM

2005-12-20 Thread Michelle Konzack
On 2005-12-12 13:23:01, Goswin von Brederlow wrote:

 Actualy one thing apt could do:
 
 [EMAIL PROTECTED]:~% host security.debian.org
 security.debian.org A   82.94.249.158
 security.debian.org A   128.101.80.133
 security.debian.org A   194.109.137.218
 
 Why not open 3 connections one to each host?

I have tested the mirrors, and there is a problem with my 8 MBit ADSL:
single download or parallel, I get the same speed all the time.

7616 kBit or 780 kByte/s

Who needs PARALLELISM, and who has a bandwidth of more than 8 MBit?

I have written a small apt-get wrapper which uses --print-uris; I piped
its output to the same number of wget processes, backgrounded and
disowned... (it downloads all the files into /var/cache/apt/archives,
and after this I install from that directory).

Same result. No performance gain.

No chance at all!  -  The Debian servers are faster.
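
A rough sketch of what such a wrapper does, in Python for illustration (the
line format shown and all names here are assumptions, not the author's
actual script):

```python
import re

# apt-get --print-uris emits one line per file to fetch, roughly:
#   'http://mirror/debian/pool/main/f/foo/foo_1.0_i386.deb' foo_1.0_i386.deb 1234 MD5Sum:...
# (format assumed here for illustration).
URI_RE = re.compile(r"^'([^']+)'", re.MULTILINE)

def extract_uris(print_uris_output):
    """Return the quoted URLs from apt-get --print-uris output."""
    return URI_RE.findall(print_uris_output)

# The wrapper would then fan these URLs out to parallel background wget
# processes writing into /var/cache/apt/archives, as described above.
```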

Greetings
Michelle






Re: apt PARALLELISM

2005-12-20 Thread Michelle Konzack
On 2005-12-12 17:06:28, Bas Zoetekouw wrote:

 But what would you gain from that?  In my experience, the mirrors are
 fast enough to saturate anything but the fastest (100Mb) links.  

Full ACK!  -  OK, I currently have only an E3 in Paris,
but if all goes right I will get my own fiber-optic STM-1.

Then I can run new tests...

Greetings
Michelle






Re: apt PARALLELISM

2005-12-14 Thread Martijn van Oosterhout
2005/12/13, Henrique de Moraes Holschuh [EMAIL PROTECTED]:

 Time to devise a way to teach it about that, then.  HOW to do it is the big
 problem, though.  How should one deal with round-robin DNS mirrors which are
 supposed to be equal, but are not?  What are the failure modes to cater
 for?

I'm not sure about all the failure modes, but the two I can think of would be:

1. One of the mirrors is out of sync
2. One of the mirrors is down

ISTM the easiest would be for apt to look up the hostname itself and
treat the single entry as a list of entries, one for each possible
address the hostname can resolve to. If one fails, try the next. If
you randomise the order, you should be able to avoid most of the
failure modes...
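
That resolve, shuffle, and fall-back idea could be sketched like this (names
and structure are assumptions; this is not apt's actual implementation):

```python
import random
import socket

def resolve_candidates(hostname, port=80):
    """All addresses behind a (possibly round-robin) hostname, shuffled
    so that a broken or out-of-sync mirror is not always tried first."""
    infos = socket.getaddrinfo(hostname, port, proto=socket.IPPROTO_TCP)
    addrs = sorted({info[4][0] for info in infos})
    random.shuffle(addrs)
    return addrs

def fetch_with_fallback(addrs, fetch):
    """Try each candidate address in turn; `fetch` downloads from one
    address and raises OSError when that mirror fails."""
    last_error = None
    for addr in addrs:
        try:
            return fetch(addr)
        except OSError as err:
            last_error = err
    raise last_error
```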

Have a nice day,



Re: apt PARALLELISM

2005-12-14 Thread Robert Lemmen
On Wed, Dec 14, 2005 at 01:23:17PM +0100, Martijn van Oosterhout wrote:
 ISTM the easiest would be for apt to look up the hostname itself and
 treat the single entry as a list of entries, one for each possible
 address the hostname can resolve to. If one fails, try the next.

apt already does that, as far as I understand it

cu  robert

-- 
Robert Lemmen   http://www.semistable.com 




Re: apt PARALLELISM

2005-12-14 Thread Claus Färber
Olaf van der Spek [EMAIL PROTECTED] wrote:
 That's not true. Suppose you've only got 3 users. If each user
 connects to one (different) mirror, he gets 1/1 of that mirror's
 bandwidth. If each user connects to each mirror, he only gets 1/3 of
 that mirror's bandwidth.

They could get 1/1 of each server (total 3/1) if they connect at
different times. With three users, this needs coordination (which makes
the effect useless). With several hundred, it only needs statistics.

Claus
-- 
http://www.faerber.muc.de






Re: apt PARALLELISM

2005-12-14 Thread Olaf van der Spek
On 13 Dec 2005 15:56:00 +0100, Claus Färber [EMAIL PROTECTED] wrote:
 Olaf van der Spek [EMAIL PROTECTED] wrote:
  That's not true. Suppose you've only got 3 users. If each user
  connects to one (different) mirror, he gets 1/1 of that mirror's
  bandwidth. If each user connects to each mirror, he only gets 1/3 of
  that mirror's bandwidth.

 They could get 1/1 of each server (total 3/1) if they connect at
 different times.

True if you assume the users have three times the bandwidth of a
mirror (on average). A 'bit' unlikely.

 With three users, this needs coordination (which makes
 the effect useless). With several hundred, it only needs statistics.

Again only if the bottleneck is the mirror's bandwidth.


Re: apt PARALLELISM

2005-12-14 Thread Wesley J. Landaker
On Wednesday 14 December 2005 14:41, Olaf van der Spek wrote:
 On 13 Dec 2005 15:56:00 +0100, Claus Färber [EMAIL PROTECTED] wrote:
  Olaf van der Spek [EMAIL PROTECTED] wrote:
   That's not true. Suppose you've only got 3 users. If each user
   connects to one (different) mirror, he gets 1/1 of that mirror's
   bandwidth. If each user connects to each mirror, he only gets 1/3 of
   that mirror's bandwidth.
 
  They could get 1/1 of each server (total 3/1) if they connect at
  different times.

 True if you assume the users have three times the bandwidth of a
 mirror (on average). A 'bit' unlikely.

It's not that simple; you have to count multiple users. If there are 500 
users accessing the mirror simultaneously, the mirror needs to have 500x 
the bandwidth of every user.

This isn't even taking into account the complexity of the internet routing 
in between, which can make multiple simultaneous sources faster--in actual 
wall-clock time--for the user no matter how fast or slow the user's or the 
mirror's connection is on average.

-- 
Wesley J. Landaker [EMAIL PROTECTED] xmpp:[EMAIL PROTECTED]
OpenPGP FP: 4135 2A3B 4726 ACC5 9094  0097 F0A9 8A4C 4CD6 E3D2




Re: apt PARALLELISM

2005-12-13 Thread Henning Makholm
Henrique de Moraes Holschuh [EMAIL PROTECTED] wrote:

 This is not something that would bother anyone if it is a single user... but
 if you have 10k users doing that, often close enough in time, well, things
 should get MUCH worse as far as I can see.  If they are doing this at random
 times in the day, OTOH, it would not be that bad, I guess.

That's what I mean. People don't synchronize their updates - certainly
I don't synchronize with anybody, and I don't know of any mechanism
that I *could* use to sync with anybody if I wanted to.

Assume a situation where mirror bandwidth is the limiting factor, and
imagine a world with 3 mirrors.  Say that during a certain time of the
day 600 users each minute start to download updated x.org packages.
Either they can do their download sequentially, choosing a random
server; then their download will be finished in 15 minutes, and each
server has a more-or-less constant 600/3*15 = 3000 connections
active. Alternatively each user can spread his load over all three
servers; his download now takes 5 minutes, and each server _still_
sees 600*5 = 3000 active connections at any time. Thus _all_ users get
it faster by parallelizing. We get the same result if only some users
parallelize - the mirrors do not see a difference in load, the smart
users get things faster, and the sequentially downloading users get it
no slower than they would have otherwise.
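
The arithmetic above checks out under Little's law (active connections =
arrival rate x time in system); a quick sketch:

```python
# Sanity-check of the thought experiment above using Little's law:
# active connections per server = arrival rate per server * download time.
users_per_minute = 600
mirrors = 3

# Sequential: each user picks one of the 3 mirrors; download takes 15 min.
sequential_load = users_per_minute / mirrors * 15   # connections per server

# Parallel: every user connects to all 3 mirrors; download takes 5 min.
parallel_load = users_per_minute * 5                # connections per server

print(sequential_load, parallel_load)   # both 3000: same per-mirror load
```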

The calculation becomes more murky if there is backbone congestion
which hits more than one mirror _and_ more than one end user. Then he
who opens more connections at a time (whether to one server or
several) will probably get an advantage at other users' expense.

But I don't think that backbone congestion is such a universal
condition that it should necessarily be the only scenario for making
moral decisions about what apt should be _able_ to do.

 Whether MY [a single individual] increased download speed is worth the extra
 load on the mirror network, and whether it WOULD increase the load on the
 mirror network is what we are asking here.

Hm, you are not even asking whether the mirror load would go up? What
_are_ you asking, then?

 (and for the people who can't read whole threads, my position is that we
 should never decrease the experience of a group of people to increase the
 experience of an individual).

I am questioning your assumption that doing parallel downloads will
necessarily decrease the experience of a group of people at all.

-- 
Henning Makholm   We should probably not start teaching each other the
 great art of arithmetic here, but I would suggest that we at the Ministry
  of Culture make sure to send out the figures, and also give a
   description of how to read the figures. Thanks for today!





Re: apt PARALLELISM

2005-12-13 Thread Olaf van der Spek
On 12/13/05, Henning Makholm [EMAIL PROTECTED] wrote:
 Assume a situation where mirror bandwidth is the limiting factor, and
 imagine a world with 3 mirrors.  Say that during a certain time of the
 day 600 users each minute start to download updated x.org packages.
 Either they can do their download sequentially, choosing a random
 server; then their download will be finished in 15 minutes, and each
 server has a more-or-less constant 600/3*15 = 3000 connections
 active. Alternatively each user can spread his load over all three
 servers; his download now takes 5 minutes, and each server _still_
 sees 600*5 = 3000 active connections at any time. Thus _all_ users get

That's not true. Suppose you've only got 3 users. If each user
connects to one (different) mirror, he gets 1/1 of that mirror's
bandwidth. If each user connects to each mirror, he only gets 1/3 of
that mirror's bandwidth.


Re: apt PARALLELISM

2005-12-13 Thread Henning Makholm
Olaf van der Spek [EMAIL PROTECTED] wrote:
 On 12/13/05, Henning Makholm [EMAIL PROTECTED] wrote:

 Alternatively each user can spread his load over all three servers;
 his download now takes 5 minutes, and each server _still_ sees
 600*5 = 3000 active connections at any time. Thus _all_ users get

 That's not true. Suppose you've only got 3 users. If each user
 connects to one (different) mirror, he gets 1/1 of that mirror's
 bandwidth.

No, he won't, because the 14 users who started in the previous 14
minutes have not finished downloading yet. He can get 1/15 of the
mirror's bandwidth.

 If each user connects to each mirror, he only gets 1/3 of that
 mirror's bandwidth.

No. There will now be three new users connecting to the server that
minute, but because all of the _previous_ users have finished faster,
only the users from the previous *four* minutes will still be
downloading. So each of the three new users gets 1/15 of the server
capacity (the 15 being 3 users from each of the previous 4 minutes plus
the three new users), but now each of them gets 1/15 of _each_ server's
capacity.

-- 
Henning Makholm We're trying to get it into the
parts per billion range, but no luck still.





Re: apt PARALLELISM

2005-12-12 Thread Ivan Adams
Thanks ...

 We don't want them to open multiple connections even to MULTIPLE servers...



Re: apt PARALLELISM

2005-12-12 Thread Martijn van Oosterhout
2005/12/12, Henrique de Moraes Holschuh [EMAIL PROTECTED]:

 We don't want them to open multiple connections even to MULTIPLE servers...

That's odd though, because apt *does* open connections to multiple
servers all the time. To fetch package lists, or if a package is only
available on one of the servers further down.

Secondly, the amount of data to be downloaded is
independent of the time it takes; thus, in aggregate, whether apt
parallelizes or not won't make any difference to the total bandwidth
used, although it may shift more load to the ftp2 servers, since they
never get used in normal usage.

Finally, how many of these slowdowns reported by people are caused by
the bandwidth-delay product? In that case, two servers will definitely
be able to achieve more than a single server by itself... I didn't think it
common practice for large mirrors to configure multi-megabyte windows...
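
The bandwidth-delay product bound mentioned above, with illustrative
(assumed) numbers:

```python
# Illustrative numbers: a plain 64 KiB TCP window over a 200 ms round
# trip caps a single connection, regardless of how fat the link is.
window_bytes = 64 * 1024        # no window scaling
rtt_seconds = 0.2               # intercontinental round trip

# A connection can move at most one window per round trip:
max_bytes_per_second = window_bytes / rtt_seconds
# That is 320 KiB/s, well under an 8 Mbit/s line, which is why a second
# connection to a different mirror can genuinely add speed here.
```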

Have a nice day,


Re: apt PARALLELISM

2005-12-12 Thread Goswin von Brederlow
Wouter Verhelst [EMAIL PROTECTED] writes:

 On Sun, Dec 11, 2005 at 02:45:50AM +0100, Marco d'Itri wrote:
 On Dec 11, Charles Fry [EMAIL PROTECTED] wrote:
 
  But if multiple URLs could satisfactorily serve requests for a single
  repository, only one of them is currently used.
 Which is fine, because we do not want people to open multiple
 connections to the same server.

 True, but that's not what's being asked here. If multiple URLs could
 serve requests for a single repository---i.e., if you've got both 
 deb http://ftp1.CC.debian.org/debian unstable main
 and
 deb http://ftp2.CC.debian.org/debian unstable main
 in your sources.list, then apt will download everything from
 ftp1.CC.debian.org (bar those files it gets an error on at that server;
 in that case, it will download them from the second server).

 Which is, indeed, silly.

Unless you have

deb file:/mnt/mirror/debian unstable main
deb http://ftp.de.debian.org/debian unstable main
deb http://ftp.debian.org/debian unstable main

where you certainly want to get all files locally and only fall back
to the web on errors, and there also first use the German mirror and
only fall back to ftp.d.o on error.


If parallelism like this gets added, then there has to be a way to
enable/disable it specifically. It is not generally a good thing.

Regards,
Goswin





Re: apt PARALLELISM

2005-12-12 Thread Ivan Adams
There could be an option in /etc/apt/apt.conf, or apt-get (install ||
upgrade) could take -? (some char here) for parallelism.  And by default
it can be disabled ...


Re: apt PARALLELISM

2005-12-12 Thread Goswin von Brederlow
Martijn van Oosterhout [EMAIL PROTECTED] writes:

 2005/12/12, Henrique de Moraes Holschuh [EMAIL PROTECTED]:

   We don't want them to open multiple connections even to
  MULTIPLE servers...


 That's odd though, because apt *does* open connections to multiple servers
 all the time. To fetch package lists, or if a package is only available on
 one of the servers further down.

 Secondly, the amount of data to be downloaded is independent of the time it
 takes; thus, in aggregate, whether apt parallelizes or not won't make any
 difference to the total bandwidth used, although it may shift more load to
 the ftp2 servers, since they never get used in normal usage.

 Finally, how many of these slowdowns reported by people are caused by the
 bandwidth-delay product? In that case, two servers will definitely be able
 to achieve more than a single server by itself... I didn't think it common
 practice for large mirrors to configure multi-megabyte windows...

 Have a nice day,

Actually, one thing apt could do:

[EMAIL PROTECTED]:~% host security.debian.org
security.debian.org A   82.94.249.158
security.debian.org A   128.101.80.133
security.debian.org A   194.109.137.218

Why not open 3 connections one to each host?

Or at least fall back to the other IPs if the first one gives an
error?
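
Opening one connection per resolved address could be as simple as dealing
the download queue out round-robin (an illustrative sketch only, not apt's
actual behaviour):

```python
def split_round_robin(files, addresses):
    """Deal a download queue out round-robin over the resolved mirror
    addresses, so each address serves roughly an equal share."""
    queues = {addr: [] for addr in addresses}
    for i, name in enumerate(files):
        queues[addresses[i % len(addresses)]].append(name)
    return queues
```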

Regards,
Goswin





Re: apt PARALLELISM

2005-12-12 Thread Henrique de Moraes Holschuh
On Mon, 12 Dec 2005, Martijn van Oosterhout wrote:
 2005/12/12, Henrique de Moraes Holschuh [EMAIL PROTECTED]:
  We don't want them to open multiple connections even to MULTIPLE
  servers...
 
 That's odd though, because apt *does* open connections to multiple servers
 all the time. To fetch packages lists, or if a package is only available on
 one of the servers further down.

Yah.  It is supposed to be the lesser of two evils, I think.  With the new
differential package lists, it will actually be a proper balance between
mirror load and user experience (see apt in experimental).

 Secondly, the amount of data to be downloaded is independent of the time it
 takes; thus, in aggregate, whether apt parallelizes or not won't make any
 difference to the total bandwidth used, although it may shift more load to
 the ftp2 servers, since they never get used in normal usage.

It will make a difference for people trying to download at the same time. I
have made this point a number of times already.

Let me get it clear:

1. We care about a large group of people a lot more than we care for an
   individual's downloading speed.

2. Thus we try to keep the mirror load down, and downloading hundreds of
   megabytes using multiple connections to multiple sources of the same file
   is heavily frowned upon.

3. If one manages to prove that the best way to achieve (2) is through n
   parallel connections to the same mirror, or n parallel connections to
   different mirrors, be my guest.

 Finally, how many of these slowdowns reported by people are caused by the
 bandwidth-delay product? In that case, two servers will definitely be able to

I won't claim this is what happens in Internet-paradise countries, but here
there are two things that affect download speed the most, after you get a
last-mile link that is not too slow (say, 384 kbit/s or more):

  TCP/IP (round trip, packet loss, windows)
  ISP backbone link congestion

Whichever one is the worst bottleneck changes along the day.  When the
cable/ADSL/radios are unstable, packet loss can cause staggering slowdowns.
During the busier hours, the backbone limitation becomes apparent.
During the wee hours of the night on long holidays, TCP/IP is the one
limiting the speed of a single connection.

 use more than a single server by itself... I didn't think it common practice
 for large mirrors to configure multi-megabyte windows...

TCP/IP windows?  Or user bw shaping?






Re: apt PARALLELISM

2005-12-12 Thread Marco d'Itri
On Dec 12, Goswin von Brederlow [EMAIL PROTECTED] wrote:

 Why not open 3 connections one to each host?
Why do that?

 Or at least fall back to the other IPs if the first one gives an
 error?
I hope that this already happens...

-- 
ciao,
Marco




Re: apt PARALLELISM

2005-12-12 Thread Marco d'Itri
On Dec 12, Henrique de Moraes Holschuh [EMAIL PROTECTED] wrote:

  use more than a single server by itself... I didn't think it common practice
  for large mirrors to configure multi-megabyte windows...
 TCP/IP windows?  Or user bw shaping?
He means the TCP window size, and it *is* common practice for large
mirrors to tune it.

-- 
ciao,
Marco




Re: apt PARALLELISM

2005-12-12 Thread Henrique de Moraes Holschuh
On Mon, 12 Dec 2005, Marco d'Itri wrote:
 On Dec 12, Henrique de Moraes Holschuh [EMAIL PROTECTED] wrote:
   use more than a single server by itself... I didn't think it common 
   practice
   for large mirror to configure multi-megabyte windows...
  TCP/IP windows?  Or user bw shaping?
 He means the TCP window size, and it *is* common practice for large
 mirrors to tune it.

Yeah, thought so.  We certainly tune TCP/IP (and just about everything else)
in all fileservers here for maximum throughput, be it over http, ftp, CIFS
or NFS.  If there is a reason to shape bandwidth, it is done through other
means at the outbond QoS shaper.  That's why I found it strange that he
asked about multi-megabyte windows.
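For reference, the window arithmetic being discussed here is the bandwidth-delay product: a single TCP connection cannot move more than one window per round trip. A back-of-the-envelope sketch (the link speed and RTT are illustrative numbers, not measurements from any real mirror):

```python
# Bandwidth-delay product: how large the TCP window must be to keep a
# link full, and the ceiling a fixed window puts on one connection.

def bdp_bytes(bandwidth_bps, rtt_s):
    """Bytes that must be in flight to fill the link."""
    return bandwidth_bps / 8 * rtt_s

def max_throughput_bps(window_bytes, rtt_s):
    """Best a single connection can do with a fixed window."""
    return window_bytes * 8 / rtt_s

# 4 Mbit/s last mile, 200 ms RTT to a distant mirror:
print(bdp_bytes(4_000_000, 0.2))        # ~100000 bytes, i.e. ~100 KB of window needed
# A plain 64 KB window on the same path tops out below the link rate:
print(max_throughput_bps(65_535, 0.2))  # ~2.6 Mbit/s
```

This is also why several parallel connections (each with its own window) can fill a long fat pipe that one connection cannot; large mirrors tune the window itself instead.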

-- 
  One disk to rule them all, One disk to find them. One disk to bring
  them all and in the darkness grind them. In the Land of Redmond
  where the shadows lie. -- The Silicon Valley Tarot
  Henrique Holschuh





Re: apt PARALLELISM

2005-12-12 Thread Henning Makholm
Scripsit Henrique de Moraes Holschuh [EMAIL PROTECTED]

 1. We care about a large lot of people a lot more than we care for an
individual's downloading speed

 2. Thus we try to keep the mirror load down, and downloading hundreds of
megabytes using multiple connections to multiple sources of the same file
   is heavily frowned upon.

Of course, trying to download the _same_ file from several different
servers simultaneously would be very wasteful. However, that seems not to be
what the proposal in this thread is about.

As far as I read the proposal, it is about downloading _different_
files from different mirrors - if you have 25 packages to get for your
'apt-get update' operation, download 5 packages from each of 5
different servers, with one connection to each server active at a
time.

While I cannot see any very common situation where such parallelism
would be an advantage, it is not clear that it would increase the load
of any or all servers.

At least, I cannot see that there would be any ill effects of a
hypothetical pseudo-parallel implementation that downloads 5 packages
from each of the 5 servers, but sequentially such that only a single
connection to a single server is active. And the difference from
_that_ to an actual parallel implementation is just to shift the
connections each server experiences a bit in time - the number of KB
served by each server stays constant.

Is your point that a server prefers to push bytes through the
connection at a constant rate, and starts wasting resources if the
available bandwidth fluctuates because the last-mile ADSL has to be
shared with a shifting number of parallel downloads from other
servers? But when the bottleneck is closest to the client, enabling
parallel downloads would not make much sense anyway.

(Of course, Goswin has a valid point that some people have their
sources.list deliberately written with a remote, undesirable, server
at the end as a _fallback_ option. Therefore parallelism should at
best be an _option_, not something that apt starts doing unbidden).

 I won't claim this is what happens in Internet-paradise countries, but here
 there are two things that affect download speed the most after you get a
 last-mile link that is not too slow (say, 384kbit/s or more):

I have 768 kb/s at home, and my apt updates through that pipe operate
close to its peak capacity. But they are at least one order of
magnitude slower than from my desk at work (which is just two or three
100+ Mb/s hops away from the national research backbone). Same mirror
in both cases.

From that experience, a last-mile link in the 1 Mb/s range would still
seem to be the limiting factor - and therefore people at the end of
such links would have little use for parallelism in the first place.

-- 
Henning Makholm  What has it got in its pocketses?





Re: apt PARALLELISM

2005-12-12 Thread Wouter Verhelst
On Mon, Dec 12, 2005 at 12:52:09PM +0100, Goswin von Brederlow wrote:
 Wouter Verhelst [EMAIL PROTECTED] writes:
  True, but that's not what's being asked here. If multiple URLs could
  serve requests for a single repository---i.e., if you've got both 
  deb http://ftp1.CC.debian.org/debian unstable main
  and
  deb http://ftp2.CC.debian.org/debian unstable main
  in your sources.list, then apt will download everything from
  ftp1.CC.debian.org (bar those files it gets an error on at that server;
  in that case, it will download them from the second server).
 
  Which is, indeed, silly.
 
 Unless you have
 
 deb file:/mnt/mirror/debian unstable main
 deb http://ftp.de.debian.org/debian unstable main
 deb http://ftp.debian.org/debian unstable main
 
 where you certainly want to get all files locally and only fall back
 to the web on errors, and there also first use the German mirror and
 only fall back to ftp.d.o on error.
 
 If parallelism like this gets added then there has to be a way to
 enable/disable it specifically. It is not generally a good thing.

I'd rather think you'd only want to parallelize per protocol. I.e.,
only get stuff from http:// URIs if you can't get them from file://
URIs, etc.

On your point with one mirror being farther away than other mirrors, I
don't think that's such a problem; this could be handled by a smart
enough algorithm to distribute packages to the fetchers. For example,
first sort them so that large packages are at the end; then start
downloading one package from each mirror, measuring the time it takes to
perform that download; and when you have enough data to be sure, give
the larger packages to the faster mirrors, and the smaller packages to
the slower ones.
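The distribution idea sketched above could look roughly like this; this is a hypothetical illustration (the function, mirror names, and speeds are all made up), not anything apt actually implements:

```python
# Sketch of the scheduling idea: measure each mirror's speed, then
# greedily hand the largest remaining package to whichever mirror
# would finish its current queue soonest.

def assign(packages, mirror_speeds):
    """packages: {name: size_bytes}; mirror_speeds: {mirror: bytes_per_s}.
    Returns {mirror: [package names in download order]}."""
    finish_at = {m: 0.0 for m in mirror_speeds}  # estimated busy-until time
    queues = {m: [] for m in mirror_speeds}
    # Largest packages first, so the fast mirrors pick them up.
    for name, size in sorted(packages.items(), key=lambda kv: -kv[1]):
        best = min(finish_at,
                   key=lambda m: finish_at[m] + size / mirror_speeds[m])
        finish_at[best] += size / mirror_speeds[best]
        queues[best].append(name)
    return queues

speeds = {"ftp1": 500_000, "ftp2": 100_000}   # measured bytes/s (made up)
pkgs = {"libc6": 4_000_000, "vim": 1_000_000, "dash": 80_000}
print(assign(pkgs, speeds))  # the slow mirror only gets the small package
```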

Such an algorithm should be more than enough for the common setup. There
will indeed be cases where that isn't going to be enough (i.e., you've
got two mirrors that are approximately as fast, but one is on the other
end of a line where you pay per downloaded byte while the other isn't),
so a way to disable parallelism will still be welcome; but I don't
think it needs to remain disabled by default.

That being said, I don't have either the time or teh skillz to implement
all this, so I'll shut up now.

-- 
.../ -/ ---/ .--./ / .--/ .-/ .../ -/ ../ -./ --./ / -.--/ ---/ ..-/ .-./ / -/
../ --/ ./ / .--/ ../ -/ / / -../ ./ -.-./ ---/ -../ ../ -./ --./ / --/
-.--/ / .../ ../ --./ -./ .-/ -/ ..-/ .-./ ./ .-.-.-/ / --/ ---/ .-./ .../ ./ /
../ .../ / ---/ ..-/ -/ -../ .-/ -/ ./ -../ / -/ ./ -.-./ / -./ ---/ .-../
---/ --./ -.--/ / .-/ -./ -.--/ .--/ .-/ -.--/ .-.-.-/ / ...-.-/





Re: apt PARALLELISM

2005-12-12 Thread Henrique de Moraes Holschuh
On Mon, 12 Dec 2005, Henning Makholm wrote:
 As far as I read the proposal, it is about downloading _different_
 files from different mirrors - if you have 25 packages to get for your
 'apt-get update' operation, download 5 packages from each of 5
 different servers, with one connection to each server active at a
 time.
 
 While I cannot see any very common situation where such parallelism
 would be an advantage, it is not clear that it would increase the load
 of any or all servers.

It would be an advantage *to the receiving end* if TCP/IP is the limiting
factor, as it compresses in time the number of connections made, thus more
of them are active at the same time and not interfering with one another.

Depending on how the queues work, even when the ISP backbone is full, trying
for more connections might increase your overall transfer speed *at the
undeniable cost of making it worse for everyone else in the ISP*.

OTOH, this compresses in time the resources used by a single individual.
Whether this translates to diminished experience for a large group of
individuals (which will also have compressed their resource usage profile in
time) or not, is not such a simple question.

 from each of the 5 servers, but sequentially such that only a single
 connection to a single server is active. And the difference from
 _that_ to an actual parallel implementation is just to shift the
 connections each server experiences a bit in time - the number of KB
 served by each server stays constant.

The bandwidth is constant, yes.  The amount of active connections and the
aggregate flow speed are not; they increase as you have compressed time.
I.e. you use more resources for a shorter time.

This is not something that would bother anyone if it is a single user... but
if you have 10k users doing that, often close enough in time, well, things
should get MUCH worse as far as I can see.  If they are doing this at random
times in the day, OTOH, it would not be that bad, I guess.

 Is your point that a server prefers to push bytes through the
 connection at a constant rate, and starts wasting resources if the

Constant _total_ average flow rates are *always* the best to work with IMHO,
but that was not what I was talking about.

 servers? But when the bottleneck is closest to the client, enabling
 parallel downloads would not make much sense anyway.

They do.  I have experienced them, I have a 4Mbit/s cable downlink at the
moment, I can assure you that, unless the ISP is having trouble on the
last-mile feeder (i.e. extreme packet loss, trying to pump more in the wire
just makes things worse), it improves my download speed to have multiple
connections (it doesn't matter if I am transferring the same data or not, I
am talking about the aggregate flow here).

Whether MY [a single individual] increased download speed is worth the extra
load on the mirror network, and whether it WOULD increase the load on the
mirror network is what we are asking here.

(and for the people who can't read whole threads, my position is that we
should never decrease the experience of a group of people to increase the
experience of an individual).

 (Of course, Goswin has a valid point that some people have their
 sources.list deliberately written with a remote, undesirable, server
 at the end as a _fallback_ option. Therefore parallelism should at
 best be an _option_, not something that apt starts doing unbidden).

Agreed.

 From that experience, a last-mile link in the 1 Mb/s range would still
 seem to be the limiting factor - and therefore people at the end of
 such links would have little use for parallelism in the first place.

That's not how it works when you have shitty backbone connectivity, like in
Brazil.  It doesn't matter if they deliver 4Mbit/s to your home, the
network in the middle is crap when compared to what I've seen in the USA and
Europe.

-- 
  One disk to rule them all, One disk to find them. One disk to bring
  them all and in the darkness grind them. In the Land of Redmond
  where the shadows lie. -- The Silicon Valley Tarot
  Henrique Holschuh


-- 
To UNSUBSCRIBE, email to [EMAIL PROTECTED]
with a subject of unsubscribe. Trouble? Contact [EMAIL PROTECTED]



Re: apt PARALLELISM

2005-12-12 Thread Ivan Adams
 As far as I read the proposal, it is about downloading _different_
 files from different mirrors - if you have 25 packages to get for your
 'apt-get update' operation, download 5 packages from each of 5
 different servers, with one connection to each server active at a
 time.

That is what I mean ...


Re: apt PARALLELISM

2005-12-12 Thread Olaf van der Spek
On 12/12/05, Ivan Adams [EMAIL PROTECTED] wrote:


  As far as I read the proposal, it is about downloading _different_
  files from different mirrors - if you have 25 packages to get for your
  'apt-get update' operation, download 5 packages from each of 5
  different servers, with one connection to each server active at a
  time.
 

 That is what I mean ...

But with HTTP pipelining you can download all 25 files with only a
single TCP connection.
Using 5 TCP connections (to different servers) instead of 1 simply
means you're using more resources for yourself.


Re: apt PARALLELISM

2005-12-12 Thread Ivan Adams
My goal is using more bandwidth via the apt-proxy servers of my friends, who
have a separate internet connection. I want to download the first package
over my own connection and, at the same time, the second package from the
apt-proxy over my friend's internet connection.

Now, I can only download all packages over my own connection, and only if it
fails will apt go to the apt-proxy ...


Re: apt PARALLELISM

2005-12-12 Thread Linas Zvirblis

Ivan Adams wrote:


My goal is using more bandwidth with apt-proxy servers on my friends, who
have other internet connection.
And I want to download first packet from my internet and second packet at
the same time from the apt-proxy with my friend internet connection.

Now, I can only download all packets from my internet, and if it fails will
go to the apt-proxy ...


Sounds like your /etc/apt/sources.list does not match your expectations. 
Servers will be used in the order they appear in the file.






Re: apt PARALLELISM

2005-12-12 Thread Bas Zoetekouw
Hi Ivan!

You wrote:

 
 As far as I read the proposal, it is about downloading _different_
 files from different mirrors - if you have 25 packages to get for your
 'apt-get update' operation, download 5 packages from each of 5
 different servers, with one connection to each server active at a
 time.
 
 That is what I mean ...  

But what would you gain from that?  In my experience, the mirrors are
fast enough to saturate anything but the fastest (100Mb) links.  

-- 
Kind regards,
++
| Bas Zoetekouw  | GPG key: 0644fab7 |
|| Fingerprint: c1f5 f24c d514 3fec 8bf6 |
| [EMAIL PROTECTED], [EMAIL PROTECTED] |  a2b1 2bae e41f 0644 fab7 |
++ 





Re: apt PARALLELISM

2005-12-12 Thread Simon Richter

Hi,

Bas Zoetekouw wrote:


But what would you gain from that?  In my experience, the mirrors are
fast enough to saturate anything but the fastest (100Mb) links.  


I think the idea is

a) load-balancing over multiple DSL lines
b) checking a bunch of apt-proxy servers whether they can provide the 
file instantaneously before asking one to actually get the file (with 
the added constraint that a different server be queried for every file 
not present anywhere, so the load balancing will still work).


My suggestion would be to simply drop apt-proxy and turn towards squid,
which can do exactly that (there is an inter-proxy protocol); it will
take a bit to configure it to let .deb files live longer than other
files (since they don't change), but that should not be too hard. The only
thing it doesn't do is load balancing the DSL lines.
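The squid side of that suggestion might look roughly like the following squid.conf fragment; the lifetimes are guesses to be tuned, not a tested configuration:

```
# Keep immutable Debian package files for up to ~90 days (129600 minutes),
# overriding the origin server's expiry headers.
refresh_pattern -i \.u?deb$  129600 100% 129600
# Index files (Packages, Sources, Release) change often and must stay fresh.
refresh_pattern -i (Packages|Sources|Release)  0 20% 60
# Everything else: squid's stock default.
refresh_pattern .  0 20% 4320
```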


   Simon




Re: apt PARALLELISM

2005-12-12 Thread Joey Hess
Marco d'Itri wrote:
  Or at least fall back to the other IPs if the first one gives an
  error?
 I hope that this already happens...

apt doesn't know anything about round robin dns, and especially with
secure apt, if one mirror gets out of sync things break horribly. This
recently happened with http.us.debian.org which had one mirror desynced
for a week or more (or perhaps still; I had to stop using it because of
that).

-- 
see shy jo




Re: apt PARALLELISM

2005-12-12 Thread Henrique de Moraes Holschuh
On Mon, 12 Dec 2005, Joey Hess wrote:
 Marco d'Itri wrote:
   Or at least fall back to the other IPs if the first one gives an
   error?
  I hope that this already happens...
 
 apt doesn't know anything about round robin dns, and especially with
 secure apt, if one mirror gets out of sync things break horribly. This

Time to devise a way to teach it about that, then.  HOW to do it is the big
problem, though.  How should one deal with round-robin DNS mirrors which are
supposed to be equal, but are not?  What are the failure modes to cater
for?
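One piece of such a scheme could be as simple as resolving the round-robin name to all of its addresses and falling back between them; this is a hypothetical sketch (the function names are made up), and a real implementation would still need to re-verify the Release signatures per member, since the members may be out of sync:

```python
# Sketch: resolve a round-robin hostname to all of its addresses and
# try each one in turn, instead of letting one desynced or dead member
# break the whole download.
import socket

def addresses(host, port=80):
    """All distinct IPs behind a (possibly round-robin) hostname."""
    infos = socket.getaddrinfo(host, port, proto=socket.IPPROTO_TCP)
    return list(dict.fromkeys(info[4][0] for info in infos))

def fetch_with_fallback(ips, fetch):
    """Call fetch(ip) for each address until one succeeds."""
    last_err = None
    for ip in ips:
        try:
            return fetch(ip)
        except OSError as err:
            last_err = err
    raise last_err

# e.g. fetch_with_fallback(addresses("http.us.debian.org"), my_http_get)
```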

-- 
  One disk to rule them all, One disk to find them. One disk to bring
  them all and in the darkness grind them. In the Land of Redmond
  where the shadows lie. -- The Silicon Valley Tarot
  Henrique Holschuh





Re: apt PARALLELISM

2005-12-11 Thread Wouter Verhelst
On Sun, Dec 11, 2005 at 02:45:50AM +0100, Marco d'Itri wrote:
 On Dec 11, Charles Fry [EMAIL PROTECTED] wrote:
 
  But if multiple URLs could satisfactorily serve requests for a single
  repository, only one of them is currently used.
 Which is fine, because we do not want people to open multiple
 connections to the same server.

True, but that's not what's being asked here. If multiple URLs could
serve requests for a single repository---i.e., if you've got both 
deb http://ftp1.CC.debian.org/debian unstable main
and
deb http://ftp2.CC.debian.org/debian unstable main
in your sources.list, then apt will download everything from
ftp1.CC.debian.org (bar those files it gets an error on at that server;
in that case, it will download them from the second server).

Which is, indeed, silly.

-- 
.../ -/ ---/ .--./ / .--/ .-/ .../ -/ ../ -./ --./ / -.--/ ---/ ..-/ .-./ / -/
../ --/ ./ / .--/ ../ -/ / / -../ ./ -.-./ ---/ -../ ../ -./ --./ / --/
-.--/ / .../ ../ --./ -./ .-/ -/ ..-/ .-./ ./ .-.-.-/ / --/ ---/ .-./ .../ ./ /
../ .../ / ---/ ..-/ -/ -../ .-/ -/ ./ -../ / -/ ./ -.-./ / -./ ---/ .-../
---/ --./ -.--/ / .-/ -./ -.--/ .--/ .-/ -.--/ .-.-.-/ / ...-.-/





Re: apt PARALLELISM

2005-12-11 Thread Marco d'Itri
On Dec 11, Wouter Verhelst [EMAIL PROTECTED] wrote:

 True, but that's not what's being asked here. If multiple URLs could
 serve requests for a single repository---i.e., if you've got both 
 deb http://ftp1.CC.debian.org/debian unstable main
 and
 deb http://ftp2.CC.debian.org/debian unstable main
 in your sources.list, then apt will download everything from
 ftp1.CC.debian.org (bar those files it gets an error on at that server;
 in that case, it will download them from the second server).
It's not clear which problem this would solve.
If a single mirror cannot saturate the network link then maybe a
different mirror should be used.

-- 
ciao,
Marco




Re: apt PARALLELISM

2005-12-11 Thread Henrique de Moraes Holschuh
On Sun, 11 Dec 2005, Marco d'Itri wrote:
 On Dec 11, Charles Fry [EMAIL PROTECTED] wrote:
  But if multiple URLs could satisfactorily serve requests for a single
  repository, only one of them is currently used.
 Which is fine, because we do not want people to open multiple
 connections to the same server.

We don't want them to open multiple connections even to MULTIPLE servers...

-- 
  One disk to rule them all, One disk to find them. One disk to bring
  them all and in the darkness grind them. In the Land of Redmond
  where the shadows lie. -- The Silicon Valley Tarot
  Henrique Holschuh





Re: apt PARALLELISM

2005-12-10 Thread Cosimo Alfarano
On Tue, Dec 06, 2005 at 09:09:39AM -0200, Henrique de Moraes Holschuh wrote:
 I doubt very much that parallel downloads will be added to apt.

It could be added on a per-Release-file basis.
Parallelize only unrelated pools, and have it disabled by default to avoid
conflict with apt-proxy (if any).

I.e., Debian and Foo vendors can be parallelized without jeopardizing any
mirror maintainer's effort, since they're completely disjoint.

just my 2 cents,
c.
-- 
Cosimo Alfarano, kalfa at {bononia.it,debian.org,cs.unibo.it}
0DBD 8FCC 4F6B 8D41 8F43  63A1 E43B 153C CB46 7E27
foaf://www.bononia.it/~kalfa/foaf.rdf





Re: apt PARALLELISM

2005-12-10 Thread Stephen Gran
This one time, at band camp, Cosimo Alfarano said:
 On Tue, Dec 06, 2005 at 09:09:39AM -0200, Henrique de Moraes Holschuh wrote:
  I doubt very much that parallel downloads will be added to apt.
 
 Could be added Release-file based.
 Parallelize only unrelated pools, and have it disabled by default to avoid
 conflict with apt-proxy (if any).

Separate repositories are already parallelized, which is what I think
you are asking for here.
-- 
 -
|   ,''`.Stephen Gran |
|  : :' :[EMAIL PROTECTED] |
|  `. `'Debian user, admin, and developer |
|`- http://www.debian.org |
 -




Re: apt PARALLELISM

2005-12-10 Thread Charles Fry
 Seperate repositiories are already parallelized, which is what I think
 you are asking for here.

But if multiple URLs could satisfactorily serve requests for a single
repository, only one of them is currently used.

Ideally, the amount of parallelism could match the number of redundant
URLs provided for a single type of source.

Charles

-- 
Substitutes
Resemble
Tail-chasing pup
Follow and follow
But never catch up
Burma-Shave
http://burma-shave.org/jingles/1941/substitutes




Re: apt PARALLELISM

2005-12-10 Thread Marco d'Itri
On Dec 11, Charles Fry [EMAIL PROTECTED] wrote:

 But if multiple URLs could satisfactorily serve requests for a single
 repository, only one of them is currently used.
Which is fine, because we do not want people to open multiple
connections to the same server.

-- 
ciao,
Marco




Re: apt PARALLELISM

2005-12-06 Thread Henrique de Moraes Holschuh
On Tue, 06 Dec 2005, Ivan Adams wrote:
 I have slow internet connection. BUT I have friends with the same connection
 in my local area network, who have apt-proxy.
 My goal is: When I need to install new system (Debian) on new user, or
 dist-upgrade on entire system, I need the unstable packets from site. In
 this case I need to wait some HOURS. If apt have *PARALLELISM* , I could
 use  my connection and at the same time the connections of apt-proxy.
 In that case I will download the packets twice (or more) faster.

You could also have a (partial) local mirror that you update over the night,
or use only apt-proxy connections in all machines but one, and have that one
do regular updates to keep apt-proxy fresh with new packages.

I doubt very much that parallel downloads will be added to apt.

-- 
  One disk to rule them all, One disk to find them. One disk to bring
  them all and in the darkness grind them. In the Land of Redmond
  where the shadows lie. -- The Silicon Valley Tarot
  Henrique Holschuh





Re: apt PARALLELISM

2005-12-06 Thread George Danchev
On Tuesday 06 December 2005 13:09, Henrique de Moraes Holschuh wrote:
 On Tue, 06 Dec 2005, Ivan Adams wrote:
  I have slow internet connection. BUT I have friends with the same
  connection in my local area network, who have apt-proxy.
  My goal is: When I need to install new system (Debian) on new user, or
  dist-upgrade on entire system, I need the unstable packets from site. In
  this case I need to wait some HOURS. If apt have *PARALLELISM* , I could
  use  my connection and at the same time the connections of apt-proxy.
  In that case I will download the packets twice (or more) faster.

 You could also have a (partial) local mirror that you update over the
 night, or use only apt-proxy connections in all machines but one, and have
 that one do regular updates to keep apt-proxy fresh with new packages.

 I doubt very much that parallel downloads will be added to apt.

apt-torrent seems to approach that too: 
http://sianka.free.fr/documentation.html

-- 
pub 4096R/0E4BD0AB 2003-03-18 people.fccf.net/danchev/key pgp.mit.edu
fingerprint 1AE7 7C66 0A26 5BFF DF22 5D55 1C57 0C89 0E4B D0AB 





Re: apt PARALLELISM

2005-12-06 Thread Henrique de Moraes Holschuh
On Tue, 06 Dec 2005, George Danchev wrote:
 apt-torrent seems to approach that too: 
 http://sianka.free.fr/documentation.html

Now, THAT is something nice. BitTorrent won't overload the mirrors ever.

-- 
  One disk to rule them all, One disk to find them. One disk to bring
  them all and in the darkness grind them. In the Land of Redmond
  where the shadows lie. -- The Silicon Valley Tarot
  Henrique Holschuh





Re: apt PARALLELISM

2005-12-05 Thread Olaf van der Spek
On 12/5/05, Ivan Adams [EMAIL PROTECTED] wrote:
 Example: (/etc/apt/sources.list)
 deb http://ftp.en.debian.org/debian main stable contrib non-free
 deb http://ftp.de.debian.org/debian main stable contrib non-free

 in this case the stable packages will be ONLY downloaded from first server
 from the list ...

And what is the problem?


Re: apt PARALLELISM

2005-12-05 Thread Joe Smith


Olaf van der Spek [EMAIL PROTECTED] wrote in message 
news:[EMAIL PROTECTED]

On 12/5/05, Ivan Adams [EMAIL PROTECTED] wrote:

Example: (/etc/apt/sources.list)
deb http://ftp.en.debian.org/debian main stable contrib non-free
deb http://ftp.de.debian.org/debian main stable contrib non-free

in this case the stable packages will be ONLY downloaded from first 
server

from the list ...


And what is the problem?


This person is requesting parallel downloads from multiple servers. So
basically, during package download, if there are three full and up-to-date
mirrors in sources.list, there should be simultaneous downloads of different
packages from all three different mirrors.
The concept is that in some cases this can noticeably improve performance,
especially with sites that throttle bandwidth, or have some other sort of
bottleneck.


I would say this is a feature request, rather than a bug report of any kind. 







Re: apt PARALLELISM

2005-12-05 Thread Olaf van der Spek
On 12/5/05, Joe Smith [EMAIL PROTECTED] wrote:

 Olaf van der Spek [EMAIL PROTECTED] wrote in message
 news:[EMAIL PROTECTED]
  On 12/5/05, Ivan Adams [EMAIL PROTECTED] wrote:
  Example: (/etc/apt/sources.list)
  deb http://ftp.en.debian.org/debian main stable contrib non-free
  deb http://ftp.de.debian.org/debian main stable contrib non-free
 
  in this case the stable packages will be ONLY downloaded from first
  server
  from the list ...
 
  And what is the problem?

 This person is requesting parallel downloads from multiple servers. So
 basically, during package download, if there are three full and up-to-date
 mirrors in sources.list, there should be simultaneous downloads of different
 packages from all three different mirrors.
 The concept is that in some cases this can noticeably improve performance,
 especially with sites that throttle bandwidth, or have some other sort of
 bottleneck.

Do you mean throttling at the mirror site? Or between the mirror and
the end-user?
If the global (world) overhead of parallel downloads increases it may
not be a good idea to do it.


Re: apt PARALLELISM

2005-12-05 Thread Joe Smith


Olaf van der Spek [EMAIL PROTECTED] wrote in message 
news:[EMAIL PROTECTED]

On 12/5/05, Joe Smith [EMAIL PROTECTED] wrote:


Olaf van der Spek [EMAIL PROTECTED] wrote in message
news:[EMAIL PROTECTED]
 On 12/5/05, Ivan Adams [EMAIL PROTECTED] wrote:
 Example: (/etc/apt/sources.list)
 deb http://ftp.en.debian.org/debian main stable contrib non-free
 deb http://ftp.de.debian.org/debian main stable contrib non-free

 in this case the stable packages will be ONLY downloaded from first
 server
 from the list ...

 And what is the problem?

This person is requesting parallel downloads from multiple servers. So
basically, during package download, if there are three full and up-to-date
mirrors in sources.list, there should be simultaneous downloads of different
packages from all three different mirrors.
The concept is that in some cases this can noticeably improve performance,
especially with sites that throttle bandwidth, or have some other sort of
bottleneck.


Do you mean throttling at the mirror site? Or between the mirror and
the end-user?
Either. It is possible that a router could be throttling the flow rate to a
network owned by another company. Other possible cases are where a user has
connection speeds higher than some of the servers (for example, some rich
user could have multi-T3). I'm not sure it is needed, but I do understand
that in some cases such a feature may be useful.


Now it is useless for users where the bottleneck is on their end. 







Re: apt PARALLELISM

2005-12-05 Thread Henrique de Moraes Holschuh
On Mon, 05 Dec 2005, Joe Smith wrote:
 Olaf van der Spek [EMAIL PROTECTED] wrote in message 
 news:[EMAIL PROTECTED]
 On 12/5/05, Ivan Adams [EMAIL PROTECTED] wrote:
 Example: (/etc/apt/sources.list)
 deb http://ftp.en.debian.org/debian main stable contrib non-free
 deb http://ftp.de.debian.org/debian main stable contrib non-free
 
 in this case the stable packages will be ONLY downloaded from first 
 server
 from the list ...
 
 And what is the problem?

It is like that by design AFAIK.

 This person is requesting parallel downloads from multiple servers. So 

The mirror network is there to serve as many users as we possibly can at the
same time, at reasonable speeds.  Parallel connections to the same mirror or
to a number of different ones might jeopardize that goal.

 basically, during package download, if there are three full and up-to-date
 mirrors in sources.list, there should be simultaneous downloads of different
 packages from all three different mirrors.

 The concept is that in some cases this can noticeably improve performance,
 especially with sites that throttle bandwidth, or have some other sort of
 bottleneck.

The problem with that concept is that it has the very likely effect of
degrading performance for everyone else.

-- 
  One disk to rule them all, One disk to find them. One disk to bring
  them all and in the darkness grind them. In the Land of Redmond
  where the shadows lie. -- The Silicon Valley Tarot
  Henrique Holschuh





Re: apt PARALLELISM

2005-12-05 Thread Romain Beauxis
Le Mardi 6 Décembre 2005 02:50, Joe Smith a écrit :
 Now it is useless for users where the bottleneck is on their end.

Well, it can also be useful in the case of a broken mirror, can't it?


Romain
-- 
Not even the dog
That piss against the wall of Babylon,
Shall escape his judgement



Re: apt PARALLELISM

2005-12-05 Thread Henrique de Moraes Holschuh
On Tue, 06 Dec 2005, Romain Beauxis wrote:
 Le Mardi 6 Décembre 2005 02:50, Joe Smith a écrit :
  Now it is useless for users where the bottleneck is on their end.
 
 Well, it can also be usefull in case of a broken mirror can't it?

apt already handles that and skips to the next mirror.

-- 
  One disk to rule them all, One disk to find them. One disk to bring
  them all and in the darkness grind them. In the Land of Redmond
  where the shadows lie. -- The Silicon Valley Tarot
  Henrique Holschuh





Re: apt PARALLELISM

2005-12-05 Thread Ivan Adams
Hi again,

In my case: I have a slow internet connection, BUT I have friends with the
same connection in my local area network, who have apt-proxy.

My goal is: when I need to install a new system (Debian) for a new user, or
dist-upgrade an entire system, I need the unstable packages from the site.
In this case I need to wait some HOURS. If apt had PARALLELISM, I could use
my connection and at the same time the connections of the apt-proxies. In
that case I would download the packages twice (or more) as fast.
Thank you!
Best regards