Re: Using multicast for security updates

2006-02-24 Thread aliban
Geoff Crompton wrote:

Edward Faulkner wrote:
  

Or you could just use bittorrent.  The server runs a tracker and
everyone cooperatively downloads chunks.  Same kind of idea, but it
doesn't require multicast support (which may or may not exist in
various networks).



When you say "The server runs a tracker", are you explaining bittorrent,
or do the security.debian.org servers actually run a tracker at the moment?

How well does bittorrent work for smaller files? I was always under the
(possibly mistaken) impression that it worked really well for ISO-sized
images (and larger) because the size allowed plenty of time for the
chunks to get distributed well throughout the network.

  

It should be highly available, considering that every client would
probably need the security updates and would/should then host them.

Anyway, I doubt that many users want to host files for others...


-- 
To UNSUBSCRIBE, email to [EMAIL PROTECTED]
with a subject of "unsubscribe". Trouble? Contact [EMAIL PROTECTED]



Using multicast for security updates

2006-02-23 Thread npmrphy
Has this concept been considered?

Instead of having all users connect and DL their own copies of security updates 
(which requires tremendous bandwidth), would it be possible to use multicast to 
'broadcast' the updates? The thought is that updates could be distributed 
without saturating the server's link(s).

Assume (for discussion purposes) that users have 3Mbps DL speeds. A 
client/server program would be written that uses both multicast and TCP 
connections. The client connects to the server with a standard TCP connection 
and tunes into the multicast stream. The client records which chunks of each 
package it has received; more to the point, it records which chunks it misses. 
The client stays tuned into the mcast stream until it comes around again to 
where it first tuned in.
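The client-side bookkeeping described above could be as simple as a set of received chunk IDs. A minimal sketch in Python follows; all names here are invented for illustration, not part of any existing tool:

```python
# Illustrative sketch: the client records which chunks of a package it has
# received from the mcast stream, and derives which ones it still misses.

class ChunkTracker:
    def __init__(self, total_chunks):
        self.total = total_chunks
        self.received = set()

    def on_chunk(self, chunk_id, data):
        """Called for each chunk seen on the multicast stream."""
        self.received.add(chunk_id)
        # ... write data to the right offset in the package file ...

    def missing(self):
        """Chunks to request over the TCP connection once the stream wraps."""
        return sorted(set(range(self.total)) - self.received)
```

Once the stream has come around to where the client first tuned in, `missing()` is exactly the retransmission request list.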

In a perfect world, the client would now have all the security updates. But 
since the internet world isn't perfect, there may well be lost chunks, which 
will need to be re-transmitted. Hence the TCP connection. As the client notices 
missing chunks, it sends a request for retransmission to the server, which 
coordinates all these requests and retransmits the chunks. The server could 
even be smart about it: with more than 50 requests for a missing chunk, it 
could mcast that chunk; with fewer than 50, it sends it through each 
requester's TCP connection.
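The server-side retransmission policy might be sketched like this. The 50-request threshold comes from the text above; everything else (class and function names, data shapes) is an assumption:

```python
# Hypothetical sketch of the server's retransmit decision: aggregate requests
# per chunk, then multicast popular chunks and unicast the rest over TCP.
from collections import defaultdict

RETX_THRESHOLD = 50  # requests before a chunk goes back on the mcast stream

class RetransmitQueue:
    def __init__(self):
        # chunk id -> set of clients waiting for that chunk
        self.pending = defaultdict(set)

    def request(self, chunk_id, client):
        """A client reported this chunk as missing."""
        self.pending[chunk_id].add(client)

    def flush(self, multicast_send, unicast_send):
        """Decide, per chunk, whether to multicast or unicast the retransmit."""
        for chunk_id, clients in self.pending.items():
            if len(clients) >= RETX_THRESHOLD:
                multicast_send(chunk_id)            # one send reaches everyone
            else:
                for client in clients:
                    unicast_send(client, chunk_id)  # few requesters: TCP replies
        self.pending.clear()
```

The point of the threshold is simply that one multicast retransmit costs the same regardless of how many clients are listening, while unicast cost grows with the requester count.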

Given a 3Mbps constraint, the mcast stream could be limited to 1.5Mbps, leaving 
reasonable room for re-transmitting chunks as needed. One could take the chunk 
concept one step further, such that each transmitted packet contains the 
current chunk and the previous chunk.
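The "current chunk plus previous chunk" redundancy can be shown with a toy encoder/decoder; this is an illustration of the idea only, not any real protocol, and all names are made up:

```python
# Toy illustration: each packet carries its own chunk plus a duplicate of the
# previous one, so a receiver that loses one packet (but not two in a row)
# can still recover the lost chunk from the next packet.

def make_packets(chunks):
    """Packet i carries (i, chunk i, copy of chunk i-1)."""
    return [(i, chunks[i], chunks[i - 1] if i > 0 else None)
            for i in range(len(chunks))]

def receive(packets, lost):
    """Reassemble chunks, tolerating single (non-consecutive) losses."""
    got = {}
    for i, cur, prev in packets:
        if i in lost:
            continue  # this packet never arrived
        got[i] = cur
        if prev is not None and (i - 1) not in got:
            got[i - 1] = prev  # recovered from the redundant copy
    return got
```

The cost is doubling the stream's bandwidth, which is why the text pairs it with a stream rate of half the assumed link speed.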

To take the concept one step further, the server could source multiple mcast 
streams: 20Mbps for well-connected servers, 3Mbps for cable, and 1.5Mbps for 
DSL. And maybe even a 40Kbps stream for dialup.

If one wanted to refine the process, the client could inform the server as to 
which packages it wants. The server would prioritize and aggregate all the 
requests. The package with the most requests would be broadcast next, unless 
the oldest request is more than, say, 20 minutes old. Or, multicasting could be 
used only for packages larger than some minimum size, or for the 'most popular' 
packages.
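The prioritization rule above (broadcast the most-requested package next, unless the oldest outstanding request has waited more than 20 minutes) could look roughly like this; the data shapes and names are assumptions for the sketch:

```python
# Hedged sketch of the scheduling rule: requests maps a package name to
# (request_count, oldest_request_timestamp).
import time

MAX_WAIT = 20 * 60  # seconds a request may wait before it jumps the queue

def next_package(requests, now=None):
    """Pick the next package to broadcast on the mcast stream."""
    if not requests:
        return None
    now = time.time() if now is None else now
    # Requests that have waited longer than MAX_WAIT take priority.
    overdue = [(ts, pkg) for pkg, (_, ts) in requests.items()
               if now - ts > MAX_WAIT]
    if overdue:
        return min(overdue)[1]  # oldest overdue request wins
    # Otherwise, the most-requested package goes next.
    return max(requests, key=lambda p: requests[p][0])
```

The anti-starvation clause matters: without it, a package wanted by only a few clients could be deferred indefinitely by a stream of popular ones.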

I dreamed up this scheme as a means of minimizing the bandwidth required to 
distribute security updates. Overall, users should receive their updates in 
much less time even though some users might have to wait a little longer if 
they only need a few packages. This scheme could even be used to distribute 
packages on the main server(s), not just security updates.

So what have I overlooked?

Neal






Re: Using multicast for security updates

2006-02-23 Thread Michael Loftis
Good idea, except this requires a large-scale rollout of multicast, which, 
AFAIK, hasn't happened.






Re: Using multicast for security updates

2006-02-23 Thread Neal Murphy
On Thursday 23 February 2006 17:05, Michael Loftis wrote:
 Good idea except this requires large scale rollout of multicast, which
 AFAIK, hasn't happened.

I thought it had progressed further than being a curiosity. Is its current 
scale enough to make a difference?

N





Re: Using multicast for security updates

2006-02-23 Thread Henrique de Moraes Holschuh
On Thu, 23 Feb 2006, Neal Murphy wrote:
 On Thursday 23 February 2006 17:05, Michael Loftis wrote:
  Good idea except this requires large scale rollout of multicast, which
  AFAIK, hasn't happened.
 
 I thought it had progressed further than being a curiosity. Is its current 
 scale enough to make a difference?

No, at least not on the Internet.  Multicast file transfers for security
updates could be useful in a large organization, but only if it has slow
WAN links.

-- 
  One disk to rule them all, One disk to find them. One disk to bring
  them all and in the darkness grind them. In the Land of Redmond
  where the shadows lie. -- The Silicon Valley Tarot
  Henrique Holschuh





Re: Using multicast for security updates

2006-02-23 Thread Daniel Sterling
Interesting, indeed. Looks like multicast is available on some networks:
http://www.multicasttech.com/status/mbgp.sum

But the best place to ask this type of question might be the
debian-admin or debian-mirrors mailing list.
 
[EMAIL PROTECTED] wrote:

Has this concept been considered?

Instead of having all users connect and DL their own copies of security 
updates (which requires tremendous bandwidth), would it be possible to use 
multicast to 'broadcast' the updates? The thought is that updates could be 
distributed without saturating the server's link(s).
  






Re: Using multicast for security updates

2006-02-23 Thread Edward Faulkner
On Thu, Feb 23, 2006 at 04:40:38PM -0500, [EMAIL PROTECTED] wrote:
 Instead of having all users connect and DL their own copies of
 security updates (which requires tremendous bandwidth), would it be
 possible to use multicast to 'broadcast' the updates? The thought is
 that updates could be distributed without saturating the server's
 link(s).

Or you could just use bittorrent.  The server runs a tracker and
everyone cooperatively downloads chunks.  Same kind of idea, but it
doesn't require multicast support (which may or may not exist in
various networks).




Re: Using multicast for security updates

2006-02-23 Thread Geoff Crompton
Edward Faulkner wrote:
 Or you could just use bittorrent.  The server runs a tracker and
 everyone cooperatively downloads chunks.  Same kind of idea, but it
 doesn't require multicast support (which may or may not exist in
 various networks).

When you say "The server runs a tracker", are you explaining bittorrent,
or do the security.debian.org servers actually run a tracker at the moment?

How well does bittorrent work for smaller files? I was always under the
(possibly mistaken) impression that it worked really well for ISO-sized
images (and larger) because the size allowed plenty of time for the
chunks to get distributed well throughout the network.

-- 
Geoff Crompton
Debian System Administrator
Strategic Data
+61 3 9340 9000





Re: Using multicast for security updates

2006-02-23 Thread Edward Faulkner
On Fri, Feb 24, 2006 at 11:13:35AM +1100, Geoff Crompton wrote:
 When you say "The server runs a tracker", are you explaining bittorrent,
 or do the security.debian.org servers actually run a tracker at the moment?

I was just explaining bittorrent.  Sorry for the confusion.

 How well does bittorrent work for smaller files? I was always under the
 (possibly mistaken) impression that it worked really well for ISO-sized
 images (and larger) because the size allowed plenty of time for the
 chunks to get distributed well throughout the network.

You're probably right.  If the files are too small, the overhead
dominates.




Re: Using multicast for security updates

2006-02-23 Thread Chris Evans


On Feb 23, 2006, at 4:22 PM, Edward Faulkner wrote:


On Fri, Feb 24, 2006 at 11:13:35AM +1100, Geoff Crompton wrote:
When you say "The server runs a tracker", are you explaining bittorrent,
or do the security.debian.org servers actually run a tracker at the moment?


I was just explaining bittorrent.  Sorry for the confusion.

How well does bittorrent work for smaller files? I was always under the
(possibly mistaken) impression that it worked really well for ISO-sized
images (and larger) because the size allowed plenty of time for the
chunks to get distributed well throughout the network.


You're probably right.  If the files are too small, the overhead
dominates.


FYI, there is an apt-torrent program, googled and located at
http://sianka.free.fr/


Since we are talking of writing new software anyway, either for
bittorrent or multicast, instead of a torrent for each individual
package there could be a torrent file for the entire current
archive.  Perhaps, in addition to the Packages and Release files, a
torrent file could be published that describes the entire archive.



The question becomes: is this system used to install software, or to
mirror it?  Mirrors would enjoy this, but it would mean changing the
Debian mirroring infrastructure.  For installations, I believe it
becomes more difficult.


The way I see apt-get work, it queries a server for a copy of a
file, waits for it to arrive, and then gets the next.  That would give
bad download performance in a bittorrent environment.  It would help if
apt could request all the files it wants from the local daemon at once,
and receive them in any order.



*Idea*
If the bittorrent source were the first one apt queried, and it did not
have the package yet, it would return a 404, allowing apt to go on to
the next source.  BUT it would then note the name of the requested
package, and always try to keep the most recent copy of that package
around.  Then if an update was released the following week, the
bittorrent source would notice the new version of the package and
download it.
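A rough sketch of that idea, with all names hypothetical: a local cache that 404s on a miss but remembers the requested name, so it can pre-fetch future versions of that package:

```python
# Illustrative sketch: a local package proxy that misses with a 404 (letting
# apt fall through to its next source), but learns which package names are
# wanted and pre-fetches them when a new archive index shows up.

class LearningCache:
    def __init__(self):
        self.store = {}       # package name -> file contents
        self.watched = set()  # names we have been asked for at least once

    def get(self, name):
        """Serve from cache, or 404 and start watching this name."""
        self.watched.add(name)
        if name in self.store:
            return 200, self.store[name]
        return 404, None      # apt moves on to the next source

    def refresh(self, available):
        """Called when a new archive index arrives: fetch watched names."""
        for name in self.watched & set(available):
            self.store[name] = available[name]  # e.g. fetch via bittorrent
```

The first request for any package always misses, but each subsequent security update of a watched package would already be cached locally by the time apt asks for it.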


