Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread Alexander Harrowell


Joe Abley said: "(For example, you might imagine an RSS feed with
BitTorrent enclosures, which requires no human presence to trigger
the downloads.)"

I think that is essentially the Democracy client I mentioned.
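The mechanics are simple enough to sketch. Here is a minimal,
illustrative version in Python, assuming the third-party feedparser
library and a stand-in download_torrent() helper (both are assumptions
for illustration, not anything Democracy actually ships):

    import time
    import feedparser  # third-party library: pip install feedparser

    FEED_URL = "http://example.com/episodes.rss"  # hypothetical feed
    seen = set()

    def download_torrent(url):
        # Stand-in for handing the .torrent to a real BitTorrent client.
        print("queueing", url)

    while True:
        for entry in feedparser.parse(FEED_URL).entries:
            for enc in entry.get("enclosures", []):
                href = enc.get("href")
                if enc.get("type") == "application/x-bittorrent" and href not in seen:
                    seen.add(href)
                    download_torrent(href)
        time.sleep(3600)  # poll hourly; no human presence needed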

Great thread so far, btw.


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread Michael . Dillon

> But what happens when 5% of the paying subscribers use 95% of the
> existing capacity, and then the other 95% of the subscribers complain
> about poor performance?

Capacity is too vague a word here. If we assume that the P2P
software can be made to recognize the ISP's architecture and prefer
peers that are topologically nearby, then the issue focuses on the
ISP's own internal capacity. It should not have a major impact on
the ISP's upstream capacity, which involves stuff that is rented
from others (transit, peering). Also, because P2P traffic has its
sources evenly distributed, it makes a case for cheap local
BGP peering connections, again, to offload traffic from more
expensive upstream transit/peering.

> What is the real cost to the ISP needing to upgrade the
> network to handle the additional traffic being generated by 5% of the
> subscribers when there isn't spare capacity?

In the case of DSL/Cable providers, I suspect it is mostly in
the Ethernet switches that tie the subscriber lines into the
network.

> The reason why many universities buy rate-shaping devices is dorm users
> don't restrain their application usage to only off-peak hours, which may
> or may not be related to sleeping hours.  If peer-to-peer applications
> restrained their network usage during periods of peak network usage so
> it didn't result in complaints from other users, it would probably
> have a better reputation.

I am suggesting that ISP folks should be cooperating with
P2P software developers. Typically, the developers have a very
vague understanding of how the network is structured and are
essentially trying to reverse engineer network capabilities. 
It should not be too difficult to develop P2P clients that
receive topology hints from their local ISPs. If this results
in faster or more reliable/predictable downloads, then users
will choose to use such a client. 
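A minimal sketch of what such a hint-aware client might do, assuming
the ISP publishes a list of on-net prefixes somewhere the client can
fetch them (the prefixes and the hint mechanism below are invented
for illustration; no such standard exists today):

    import ipaddress

    # Prefixes the ISP considers "on-net"; in practice these would be
    # fetched from the ISP rather than hard-coded. Purely illustrative.
    HINTS = [ipaddress.ip_network(p) for p in ("192.0.2.0/24", "198.51.100.0/22")]

    def is_local(peer_ip):
        ip = ipaddress.ip_address(peer_ip)
        return any(ip in net for net in HINTS)

    def rank_peers(peers):
        # Stable sort: topologically nearby peers first, everyone else after.
        return sorted(peers, key=lambda p: not is_local(p))

    print(rank_peers(["203.0.113.9", "192.0.2.77", "198.51.100.5"]))
    # -> ['192.0.2.77', '198.51.100.5', '203.0.113.9']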

> The Internet is good for narrowcasting, but it's
> still working on mass audience events.

Then, perhaps we should not even try to use the Internet
for mass audience events. Is there something wrong with
the current broadcast model? Did TV replace radio? Did
radio replace newspapers?

--Michael Dillon



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread Mark Smith

On Mon, 8 Jan 2007 10:25:54 +
[EMAIL PROTECTED] wrote:
<snip>

 
> I am suggesting that ISP folks should be cooperating with
> P2P software developers. Typically, the developers have a very
> vague understanding of how the network is structured and are
> essentially trying to reverse engineer network capabilities.
> It should not be too difficult to develop P2P clients that
> receive topology hints from their local ISPs. If this results
> in faster or more reliable/predictable downloads, then users
> will choose to use such a client.
 

I'd think TCP's underlying, constantly updated round-trip time
measurement to peers could be used for that. I've recently wondered
whether P2P protocols do this, but haven't found the time to check.
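A rough sketch of that idea, using TCP connect time as a stand-in for
the kernel's own RTT estimate (which applications can't usually read
directly); the port and the idea of probing peers this way are
illustrative assumptions:

    import socket
    import time

    def connect_rtt(host, port=6881, timeout=2.0):
        # Time a TCP handshake as a rough proxy for round-trip time.
        start = time.monotonic()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return time.monotonic() - start
        except OSError:
            return float("inf")  # unreachable peers sort last

    def nearest_peers(hosts, n=5):
        # Prefer the peers that answer fastest, i.e. the nearest ones.
        return sorted(hosts, key=connect_rtt)[:n]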

-- 

"Sheep are slow and tasty, and therefore must remain constantly
 alert."
   - Bruce Schneier, Beyond Fear


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread imipak


Brandon Butterworth wrote:
>> If this application takes off, I have to presume that everyone's
>> baseline network usage metrics can be tossed out the window...
>
> That'll happen anyway; what used to be considered high-volume
> content is becoming the norm, with lots of start-ups and old-school
> broadcasters getting involved.




Indeed.

http://news.bbc.co.uk/1/hi/technology/6239975.stm :

"Microsoft's work in developing IPTV (internet protocol TV), which
allows programmes to be delivered live or on demand over an internet
connection, would soon come to Xbox 360 games consoles. By the end of
2007 partner companies will be offering IPTV services to Xbox 360
owners, he said."


Hmmm...

http://www.wired.com/wired/archive/6.04/mstv.html


\a
--
Andrew Simmons


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread Marshall Eubanks


Dear Sean;

On Jan 8, 2007, at 2:34 AM, Sean Donelan wrote:



> On Sun, 7 Jan 2007, Joe Abley wrote:
>> Setting aside the issue of what particular ISPs today have to pay,
>> the real cost of sending data, best-effort over an existing
>> network which has spare capacity and which is already supported
>> and managed is surely zero.
>
> As long as the additional traffic doesn't exceed the existing
> capacity.


> But what happens when 5% of the paying subscribers use 95% of the
> existing capacity, and then the other 95% of the subscribers
> complain about poor performance?  What is the real cost to the ISP
> needing to upgrade the network to handle the additional traffic
> being generated by 5% of the subscribers when there isn't spare
> capacity?
>
>> If I acquire content while I'm sleeping, during a low dip in my
>> ISP's usage profile, the chances are good that nobody incurs more
>> costs that month than if I had decided not to acquire it. (For
>> example, you might imagine an RSS feed with BitTorrent enclosures,
>> which requires no human presence to trigger the downloads.)
>
> The reason why many universities buy rate-shaping devices is dorm
> users don't restrain their application usage to only off-peak
> hours, which may or may not be related to sleeping hours.  If
> peer-to-peer applications restrained their network usage during
> periods of peak network usage so it didn't result in complaints
> from other users, it would probably have a better reputation.




Do not count on demand being geographically localized or limited to
certain times of day. The audience for streaming is world-wide (for
an example, see

http://www.americafree.tv/Ads/geographical.html

for a few-hour slice in the early evening EST on a Sunday; note,
BTW, that this is for English-language content). The roughly equal
distribution to the US and the EU is entirely normal; typically the
peak-to-trough bandwidth usage variation during a day is less than a
factor of 2, and frequently it disappears altogether.


Regards
Marshall

>> If I acquire content the same time as many other people, since
>> what I'm watching is some coordinated, streaming event, then it
>> seems far more likely that the popularity of the content will lead
>> to network congestion, or push up a peak on an interface somewhere
>> which will lead to a requirement for a circuit upgrade, or affect
>> a 95%ile transit cost, or something.
>
> Depends on when and where the replication of the content is taking
> place.
>
> Broadcasting is a very efficient way to distribute the same content
> to large numbers of people, even when some people may watch it
> later.  You can broadcast either streaming or file downloads.  You
> can also unicast either streaming or file downloads.  Unicast tends
> to be less efficient to distribute the same content to large
> numbers of people.  Then there are lots of events in the middle.
> Some content is only of interest to some people.


> Streaming vs download and broadcast vs unicast.  There are lots of
> combinations.  One way is not necessarily the best way for every
> situation.  Sometimes store-and-forward e-mail is useful, other
> times instant messenger communications is useful.  Things may
> change over time.  For example, USENET has mostly stopped being
> widely flooded through every ISP and large institution, and is now
> accessed on demand by users from a few large aggregators.
>
> Distribution methods aren't mutually exclusive.
>
>> If asynchronous delivery of content is as free as I think it is,
>> and synchronous delivery of content is as expensive as I suspect
>> it might be, it follows that there ought to be more of the former
>> than the latter going on.
>>
>> If it turned out that there were several orders of magnitude more
>> content being shifted around the Internet in a "download when you
>> are able; watch later" fashion than there is content being
>> streamed to viewers in real-time, I would be thoroughly unsurprised.


> If you limit yourself to the Internet, you exclude a lot of content
> being shifted around and consumed in the world.  The World Cup or
> Superbowl are still much bigger events than Internet-only events.
> Broadcast television shows with even bottom ratings are still more
> popular than most Internet content.  The Internet is good for
> narrowcasting, but it's still working on mass audience events.
>
> Asynchronous receivers are more expensive and usually more
> complicated than synchronous receivers.  Not everyone owns a
> computer or spends several hundred dollars for a DVR.  If you
> already own a computer, you might consider it free.  But how many
> people want to buy a computer for each television set?  In the USA,
> Congress debated whether it should spend $40 per digital receiver
> so people wouldn't lose their over-the-air broadcasting.
>
> Gadgets that interest 5% of the population versus reaching 95% of
> the population may have different trade-offs.







Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread Joe Abley



On 8-Jan-2007, at 02:34, Sean Donelan wrote:


> On Sun, 7 Jan 2007, Joe Abley wrote:
>> Setting aside the issue of what particular ISPs today have to pay,
>> the real cost of sending data, best-effort over an existing
>> network which has spare capacity and which is already supported
>> and managed is surely zero.
>
> As long as the additional traffic doesn't exceed the existing
> capacity.


Indeed.

So perhaps we should expect distribution price models whose success
depends on that spare (off-peak, whatever) capacity being available
to be replaced by others which don't.


If that's the case, and assuming the cost benefits of using slack  
capacity continue to be exploited, the bandwidth metrics mentioned in  
the original post might be those which assume a periodic utilisation  
profile, rather than those which just assume that spare bandwidth  
will be used.


(It's still accounting based on peak; the difference might be that in  
the second model there really isn't that much of a peak any more, and  
the effect of that is a bonus window during which existing capacity  
models will sustain the flood.)
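For reference, the 95%ile billing figure mentioned earlier works
roughly like this; exact conventions vary by provider, and the
samples below are made up:

    def ninety_fifth_percentile(samples_mbps):
        # Sort a month of 5-minute samples, discard the top 5%,
        # and bill on the highest remaining sample.
        ordered = sorted(samples_mbps)
        return ordered[max(int(len(ordered) * 0.95) - 1, 0)]

    # A real month has ~8640 samples; this toy list just shows the idea.
    month = [120, 95, 400, 130, 110, 105, 980, 125]
    print(ninety_fifth_percentile(month), "Mbps billable")  # 400 Mbps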



> If you limit yourself to the Internet, you exclude a lot of content
> being shifted around and consumed in the world.  The World Cup or
> Superbowl are still much bigger events than Internet-only events.
> Broadcast television shows with even bottom ratings are still more
> popular than most Internet content.  The Internet is good for
> narrowcasting, but it's still working on mass audience events.


Ah, but I wasn't comparing internet distribution with cable/satellite/ 
UHF/whatever -- I was comparing content which is streamed with  
content which isn't.


The cost differences between those are fairly well understood, I  
think. Reliable, high-quality streaming media is expensive (ask  
someone like Akamai for a quote), whereas asynchronous delivery of  
content (e.g. through BitTorrent trackers) can result in enormous  
distribution of data with a centralised investment in hardware and  
network which is demonstrably sustainable by voluntary donations.


> Asynchronous receivers are more expensive and usually more
> complicated than synchronous receivers.


Well, there's no mainstream, blessed product which does the kind of
asynchronous acquisition of content on anything like the scale of
digital cable terminals; however, that's not to say that one couldn't
be produced for the same cost. I'd guess that most of those digital
cable boxes are running Linux anyway, which makes it a software problem.


If we're considering a fight between an intelligent network (one  
which can support good-quality, isochronous streaming video at high  
data rates from the producer to the consumer) and a stupid one (which  
concentrates on best-effort distribution of data, asynchronously,  
with a smarter edge) then absent external constraints regarding  
copyright, digital rights, etc, I presume we'd expect the stupid  
network model to win. Eventually.



> Not everyone owns a computer or spends several hundred dollars for
> a DVR.  If you already own a computer, you might consider it free.


Since I was comparing two methods of distributing material over the  
Internet, the availability of a computer is more or less a given. I'm  
not aware of a noticeable population of broadband users who don't own  
a computer, for example (apart from those who are broadband users  
without noticing, e.g. through a digital cable terminal which talks  
IP to the network).



Joe



RE: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread Bora Akyol

That's because most of these people are watching the stream
on their computer (Mac or PC).

Bring that box to the living room in an attractive package and
the stats will be very different.

 

> -----Original Message-----
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
> Behalf Of Marshall Eubanks
> Sent: Saturday, January 06, 2007 7:45 AM
> To: [EMAIL PROTECTED]
> Cc: Andrew Odlyzko; nanog@merit.edu
> Subject: Re: Network end users to pull down 2 gigabytes a
> day, continuously?
>
> On Jan 6, 2007, at 10:19 AM, Colm MacCarthaigh wrote:
>
>> On Sat, Jan 06, 2007 at 09:09:19AM -0600, Andrew Odlyzko wrote:
>>> 2.  The question I don't understand is, why stream?
>>
>> There are other good reasons, but fundamentally; because of live
>> television.
>>
>>> In these days, when a terabyte disk for consumer PCs is about to be
>>> introduced, why bother with streaming?  It is so much simpler to
>>> download (at faster than real-time rates, if possible), and play it
>>> back.
>>
>> That might be worse for download operators, because people may
>> download an hour of video, and only watch 5 minutes :/
>
> Our logs show that, for every 100 people who start to watch a
> stream, only 2 to 5% watch over 30 minutes in one sitting, even for
> VOD, where they presumably have some interest in the movie up front,
> and no more than 9% will watch all of a VOD movie, even over
> multiple viewings. This is also very consistent with time, but I
> don't have any pretty plots handy. (Our cumulative audience in 2006
> was 2.74 million people; I have lots of statistics.)
>
> So, from that standpoint, making a video file available for download
> is wasting on the order of 90% of the bandwidth used to download it.
>
> Regards
> Marshall
>
>> --
>> Colm MacCárthaigh        Public Key: colm
>> [EMAIL PROTECTED]



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread Al Iverson


On 1/8/07, Bora Akyol <[EMAIL PROTECTED]> wrote:
> That's because most of these people are watching the stream
> on their computer (Mac or PC).
>
> Bring that box to the living room in an attractive package and
> the stats will be very different.


This isn't part of the same project, but I suspect this will more or
less bring a video stream from a Slingbox (in the other room or
halfway around the earth) to the living room:

http://www.zatznotfunny.com/2007-01/slingcatcher-is-real/

Regards,
Al Iverson
--
Al Iverson -- www.aliverson.com
Visit my blog: www.spamresource.com
This is my list address. Remove "lists" from the email address to
reach me faster.
The contents of this message are copyrighted by Al Iverson. Permission
is denied to archivesat.com to archive, store, share, or otherwise
reproduce the contents of this message.


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread Marshall Eubanks


I'm working on it.

On Jan 8, 2007, at 3:29 PM, Bora Akyol wrote:


> That's because most of these people are watching the stream
> on their computer (Mac or PC).
>
> Bring that box to the living room in an attractive package and
> the stats will be very different.
>
>> -----Original Message-----
>> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
>> Behalf Of Marshall Eubanks
>> Sent: Saturday, January 06, 2007 7:45 AM
>> To: [EMAIL PROTECTED]
>> Cc: Andrew Odlyzko; nanog@merit.edu
>> Subject: Re: Network end users to pull down 2 gigabytes a
>> day, continuously?
>>
>> On Jan 6, 2007, at 10:19 AM, Colm MacCarthaigh wrote:
>>
>>> On Sat, Jan 06, 2007 at 09:09:19AM -0600, Andrew Odlyzko wrote:
>>>> 2.  The question I don't understand is, why stream?
>>>
>>> There are other good reasons, but fundamentally; because of live
>>> television.
>>>
>>>> In these days, when a terabyte disk for consumer PCs is about to be
>>>> introduced, why bother with streaming?  It is so much simpler to
>>>> download (at faster than real-time rates, if possible), and play it
>>>> back.
>>>
>>> That might be worse for download operators, because people may
>>> download an hour of video, and only watch 5 minutes :/
>>
>> Our logs show that, for every 100 people who start to watch a
>> stream, only 2 to 5% watch over 30 minutes in one sitting, even for
>> VOD, where they presumably have some interest in the movie up front,
>> and no more than 9% will watch all of a VOD movie, even over
>> multiple viewings. This is also very consistent with time, but I
>> don't have any pretty plots handy. (Our cumulative audience in 2006
>> was 2.74 million people; I have lots of statistics.)
>>
>> So, from that standpoint, making a video file available for download
>> is wasting on the order of 90% of the bandwidth used to download it.
>>
>> Regards
>> Marshall
>>
>>> --
>>> Colm MacCárthaigh        Public Key: colm
>>> [EMAIL PROTECTED]










RE: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread Bora Akyol

 

> -----Original Message-----
> From: Gian Constantine [mailto:[EMAIL PROTECTED]
> Sent: Sunday, January 07, 2007 7:18 PM
> To: nanog@merit.edu
> Subject: Re: Network end users to pull down 2 gigabytes a
> day, continuously?
>
<snip>
> In entertainment, content is king. More specifically, new
> release content is king. While internet distribution may help
> breathe life into the long tail market, it is hard to imagine
> any major shift from existing distribution methods. People
> simply like the latest TV shows and the latest movies.

What's new to you is very different from what's new to me.

I am very happy watching one-year-old episodes of Top Gear, whereas
if you are located in the UK you may consider this old news.

The story here is about the cost of storing the video content (which
is asymptotically zero) and the cost of distributing it (which is
also asymptotically approaching zero, despite the ire of the SPs).


> So, this leaves us with little more than what is already
> offered by the MSOs: linear TV and VoD. This is where things
> become complex.
>
> The studios will never (not any time soon) allow for a
> subscription-based VoD on new content. They would instantly
> be sued by Time Warner (HBO).

This is a very US-centric view of the world. I am sure there are
hundreds of TV stations from India, Turkey, Greece, etc. that would
love to put their content online and make money off the long tail.

> I guess where I am going with all this is simply it is very
> hard to make this work from a business and marketing side.
> The network constraints are, likely, a minor issue for some
> time to come. Interest is low in the public at large for
> primary (or even major secondary) video service on the PC.

Again, your views are very US-centric and mono-cultural.

If you open your horizons, I think there is a world of content out
there that the content owners would be happy to license and sell at
<10 cents a pop. To them it is dead content, but it turns out to be
worth something to someone out there. This is what iTunes and
Rhapsody are doing with music. And the day of the video is coming.

Bora

-- Off to raise some venture funds now. (Just kidding ;)



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread Gian Constantine
Well, yes. My view on this subject is U.S.-centric. In fairness to  
me, this is NANOG, not AFNOG or EuroNOG or SANOG.


I would also argue storage and distribution costs are not
asymptotically zero with scale. Well-designed SANs are not cheap.
Well-designed distribution systems are not cheap. While price does
decrease when scaled upwards, the cost of such an operation remains
hefty, and increases with additions to the offered content library
and a swelling of demand for this content. I believe the graph
becomes neither asymptotic nor anywhere near zero.


You are correct on the long tail nature of music. But music is not  
consumed in a similar manner as TV and movies. Television and movies  
involve a little more commitment and attention. Music is more for the  
moment and the mood. There is an immediacy with music consumption.  
Movies and television require a slight degree more patience from the  
consumer. The freshness (debatable :-) ) of new release movies and TV  
can often command the required patience from the consumer. Older  
content rarely has the same pull.


I agree there is a market for ethnic and niche content, but it is not  
the broad market many companies look for. The investment becomes much  
more of a gamble than marketing the latest and greatest (again  
debatable :-) ) to the larger market of...well...everyone.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]



On Jan 8, 2007, at 5:15 PM, Bora Akyol wrote:






>> -----Original Message-----
>> From: Gian Constantine [mailto:[EMAIL PROTECTED]
>> Sent: Sunday, January 07, 2007 7:18 PM
>> To: nanog@merit.edu
>> Subject: Re: Network end users to pull down 2 gigabytes a
>> day, continuously?
>
> <snip>
>
>> In entertainment, content is king. More specifically, new
>> release content is king. While internet distribution may help
>> breathe life into the long tail market, it is hard to imagine
>> any major shift from existing distribution methods. People
>> simply like the latest TV shows and the latest movies.
>
> What's new to you is very different from what's new to me.
>
> I am very happy watching one-year-old episodes of Top Gear, whereas
> if you are located in the UK you may consider this old news.
>
> The story here is about the cost of storing the video content (which
> is asymptotically zero) and the cost of distributing it (which is
> also asymptotically approaching zero, despite the ire of the SPs).
>
>> So, this leaves us with little more than what is already
>> offered by the MSOs: linear TV and VoD. This is where things
>> become complex.
>>
>> The studios will never (not any time soon) allow for a
>> subscription-based VoD on new content. They would instantly
>> be sued by Time Warner (HBO).
>
> This is a very US-centric view of the world. I am sure there are
> hundreds of TV stations from India, Turkey, Greece, etc. that would
> love to put their content online and make money off the long tail.
>
>> I guess where I am going with all this is simply it is very
>> hard to make this work from a business and marketing side.
>> The network constraints are, likely, a minor issue for some
>> time to come. Interest is low in the public at large for
>> primary (or even major secondary) video service on the PC.
>
> Again, your views are very US-centric and mono-cultural.
>
> If you open your horizons, I think there is a world of content out
> there that the content owners would be happy to license and sell at
> <10 cents a pop. To them it is dead content, but it turns out to be
> worth something to someone out there. This is what iTunes and
> Rhapsody are doing with music. And the day of the video is coming.
>
> Bora
>
> -- Off to raise some venture funds now. (Just kidding ;)





RE: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread Bora Akyol

Please see my comments inline:

> -----Original Message-----
> From: Gian Constantine [mailto:[EMAIL PROTECTED]
> Sent: Monday, January 08, 2007 4:27 PM
> To: Bora Akyol
> Cc: nanog@merit.edu
> Subject: Re: Network end users to pull down 2 gigabytes a
> day, continuously?
>
<snip>
> I would also argue storage and distribution costs are not
> asymptotically zero with scale. Well-designed SANs are not
> cheap. Well-designed distribution systems are not cheap.
> While price does decrease when scaled upwards, the cost of
> such an operation remains hefty, and increases with additions
> to the offered content library and a swelling of demand for
> this content. I believe the graph becomes neither asymptotic
> nor anywhere near zero.

To the end user, there is no cost to downloading videos while they
are sleeping. I would argue that, other than sports (and some news)
events, there is pretty much no content that needs to be real-time.
What the downloading (possibly 24x7) does is stress the ISP network
to its max, since the assumptions of statistical multiplexing go out
the window. Think of a Tivo that downloads content off the Internet
24x7.

The user is still paying only what they pay each month, and this is
network neutrality 2.0 all over again.


> You are correct on the long tail nature of music. But music
> is not consumed in a similar manner as TV and movies.
> Television and movies involve a little more commitment and
> attention. Music is more for the moment and the mood. There
> is an immediacy with music consumption. Movies and television
> require a slight degree more patience from the consumer. The
> freshness (debatable :-) ) of new release movies and TV can
> often command the required patience from the consumer. Older
> content rarely has the same pull.

I would argue against your distinction between visual and auditory
content. There is a lot of content out there that a lot of people
watch even though it is 20-40+ years old. Think Brady Bunch, Bonanza,
or archived games from the NFL, MLB, etc. What about the Smurfs (for
those of us with kids)?

This is only the beginning.

If I can get a 500GB box and download MP4 content, that's a lot of
essentially free storage.

Coming back to NANOG content, I think video (not streamed, but
multi-path distributed video) is going to bring networks down not by
sheer bandwidth alone but by challenging the assumptions behind the
engineering of the network. I don't think you need huge SANs per se
to store the content either; since it is multi-source/multi-sink, the
reliability is built in.

The SPs like Verizon & AT&T moving fiber to the home, hoping to get
in on the value-add action, are in for an awakening IMHO.

Regards

Bora
ps. I apologize for the tone of my previous email. That sounded grumpier
than I usually am.




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread Simon Lyall

On Mon, 8 Jan 2007, Gian Constantine wrote:
> I would also argue storage and distribution costs are not
> asymptotically zero with scale. Well-designed SANs are not cheap.
> Well-designed distribution systems are not cheap. While price does
> decrease when scaled upwards, the cost of such an operation remains
> hefty, and increases with additions to the offered content library
> and a swelling of demand for this content. I believe the graph
> becomes neither asymptotic nor anywhere near zero.

Let's see what I can do using today's technology:

According to the iTunes website they have over 3.5 million songs. Let's
call it 4 million. Assume a decent bit rate and make them average 10 MB
each. That's 40 TB, which would cost me $6k per month to store on Amazon
S3. Let's assume we use Amazon EC2 to only allow torrents of the files to
be downloaded, and we transfer each file twice per month. Total cost: around
$20k per month, or $250k per year. Add $10k to pay somebody to create the
interface, put up a few banner ads, and it'll be self-supporting.
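Spelling out that arithmetic, with 2007-era Amazon prices taken as
assumptions (roughly $0.15/GB-month for S3 storage and $0.18/GB for
transfer out; check current price lists):

    songs = 4_000_000
    storage_gb = songs * 10 / 1024         # ~10 MB each -> ~39,000 GB (~40 TB)
    storage_cost = storage_gb * 0.15       # ~$5,900/month: the ~$6k figure
    transfer_cost = storage_gb * 2 * 0.18  # each file downloaded twice a month
    print(f"${storage_cost + transfer_cost:,.0f}/month")  # ~$20,000/month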

That sort of setup could come out of petty cash for larger ISPs'
marketing departments.

Of course, there are a few problems with the above business model (mostly
legal), but infrastructure costs are not one of them. Plug in your own
numbers for movies and TV shows, but 40 TB for each will probably be enough.

-- 
Simon J. Lyall  |  Very Busy  |  Web: http://www.darkmere.gen.nz/
To stay awake all night adds a day to your life - Stilgar | eMT.



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread Gian Constantine
There may have been a disconnect on my part, or at least a failure
to disclose my position. I am looking at things from a provider
standpoint, whether as an ISP or a strict video service provider.


I agree with you. From a consumer standpoint, a trickle or off-peak
download model is the ideal low-impact solution to content delivery.
And absolutely, a 500GB drive would almost be overkill on space for
disposable content encoded in H.264. Excellent SD (480i) content can
be achieved at ~1200 to 1500kbps, resulting in about a 1GB file for a
90-minute title. HD is almost out of the question for internet
download: even good 720p at ~5500kbps results in roughly a 3.7GB
file for a 90-minute title.
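The file-size arithmetic behind those figures, for anyone who wants
to plug in other bitrates:

    def file_size_gb(kbps, minutes):
        # bits/second * seconds, divided by 8 bits/byte and 1e9 bytes/GB
        return kbps * 1000 * minutes * 60 / 8 / 1e9

    print(round(file_size_gb(1500, 90), 2))  # ~1.01 GB: SD at 1500 kbps
    print(round(file_size_gb(5500, 90), 2))  # ~3.71 GB: 720p at 5500 kbps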


Service providers wishing to provide this service to their customers  
may see some success where they control the access medium (copper  
loop, coax, FTTH). Offering such a service to customers outside of  
this scope would prove very expensive, and likely, would never see a  
return on the investment without extensive peering arrangements. Even  
then, distribution rights would be very difficult to attain without  
very deep pockets and crippling revenue sharing. The studios really  
dislike the idea of transmission outside of a closed network. Don't  
forget. Even the titles you mentioned are still owned by very large  
companies interested in squeezing every possible dime from their  
assets. They would not be cheap to acquire.


Further, torrent-like distribution is a long, long way from sign-off
by the content providers. They see torrents as the number one tool
of content piracy. This is a major reason I see the discussion of
tripping upstream usage limits through content distribution as moot.


I am with you on the vision of massive content libraries at the  
fingertips of all, but I see many roadblocks in the way. And, almost  
none of them are technical in nature.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]



On Jan 8, 2007, at 7:51 PM, Bora Akyol wrote:



> Please see my comments inline:
>
>> -----Original Message-----
>> From: Gian Constantine [mailto:[EMAIL PROTECTED]
>> Sent: Monday, January 08, 2007 4:27 PM
>> To: Bora Akyol
>> Cc: nanog@merit.edu
>> Subject: Re: Network end users to pull down 2 gigabytes a
>> day, continuously?
>
> <snip>
>
>> I would also argue storage and distribution costs are not
>> asymptotically zero with scale. Well-designed SANs are not
>> cheap. Well-designed distribution systems are not cheap.
>> While price does decrease when scaled upwards, the cost of
>> such an operation remains hefty, and increases with additions
>> to the offered content library and a swelling of demand for
>> this content. I believe the graph becomes neither asymptotic
>> nor anywhere near zero.
>
> To the end user, there is no cost to downloading videos while they
> are sleeping. I would argue that, other than sports (and some news)
> events, there is pretty much no content that needs to be real-time.
> What the downloading (possibly 24x7) does is stress the ISP network
> to its max, since the assumptions of statistical multiplexing go out
> the window. Think of a Tivo that downloads content off the Internet
> 24x7.
>
> The user is still paying only what they pay each month, and this is
> network neutrality 2.0 all over again.
>
>> You are correct on the long tail nature of music. But music
>> is not consumed in a similar manner as TV and movies.
>> Television and movies involve a little more commitment and
>> attention. Music is more for the moment and the mood. There
>> is an immediacy with music consumption. Movies and television
>> require a slight degree more patience from the consumer. The
>> freshness (debatable :-) ) of new release movies and TV can
>> often command the required patience from the consumer. Older
>> content rarely has the same pull.
>
> I would argue against your distinction between visual and auditory
> content. There is a lot of content out there that a lot of people
> watch even though it is 20-40+ years old. Think Brady Bunch, Bonanza,
> or archived games from the NFL, MLB, etc. What about the Smurfs (for
> those of us with kids)?
>
> This is only the beginning.
>
> If I can get a 500GB box and download MP4 content, that's a lot of
> essentially free storage.
>
> Coming back to NANOG content, I think video (not streamed, but
> multi-path distributed video) is going to bring networks down not by
> sheer bandwidth alone but by challenging the assumptions behind the
> engineering of the network. I don't think you need huge SANs per se
> to store the content either; since it is multi-source/multi-sink, the
> reliability is built in.
>
> The SPs like Verizon & AT&T moving fiber to the home, hoping to get
> in on the value-add action, are in for an awakening IMHO.
>
> Regards
>
> Bora
> ps. I apologize for the tone of my previous email. That sounded
> grumpier than I usually am.






Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread Thomas Leavitt
So, kind of back to the original question: what is going to be the
reaction of your average service provider to the presence of an
increasing number of people sucking down massive amounts of video and
spitting it back out again... nothing? Throttling all traffic of a
certain type? Shutting down customers who exceed certain thresholds?
Or just throttling their traffic? Massive upgrades of internal
network hardware?


Is it your contention that there's no economic model, given the
architecture of current networks, which would generate enough
revenue to offset the cost of traffic generated by P2P video?


Thomas

Gian Constantine wrote:
> There may have been a disconnect on my part, or at least a failure
> to disclose my position. I am looking at things from a provider
> standpoint, whether as an ISP or a strict video service provider.
>
> I agree with you. From a consumer standpoint, a trickle or off-peak
> download model is the ideal low-impact solution to content delivery.
> And absolutely, a 500GB drive would almost be overkill on space for
> disposable content encoded in H.264. Excellent SD (480i) content can
> be achieved at ~1200 to 1500kbps, resulting in about a 1GB file for
> a 90-minute title. HD is almost out of the question for internet
> download: even good 720p at ~5500kbps results in roughly a 3.7GB
> file for a 90-minute title.
>
> Service providers wishing to provide this service to their customers
> may see some success where they control the access medium (copper
> loop, coax, FTTH). Offering such a service to customers outside of
> this scope would prove very expensive, and likely, would never see a
> return on the investment without extensive peering arrangements. Even
> then, distribution rights would be very difficult to attain without
> very deep pockets and crippling revenue sharing. The studios really
> dislike the idea of transmission outside of a closed network. Don't
> forget. Even the titles you mentioned are still owned by very large
> companies interested in squeezing every possible dime from their
> assets. They would not be cheap to acquire.
>
> Further, torrent-like distribution is a long, long way from sign-off
> by the content providers. They see torrents as the number one tool
> of content piracy. This is a major reason I see the discussion of
> tripping upstream usage limits through content distribution as moot.
>
> I am with you on the vision of massive content libraries at the
> fingertips of all, but I see many roadblocks in the way. And, almost
> none of them are technical in nature.
>
> Gian Anthony Constantine
> Senior Network Design Engineer
> Earthlink, Inc.
> Office: 404-748-6207
> Cell: 404-808-4651
> Internal Ext: x22007
> [EMAIL PROTECTED]
>
> On Jan 8, 2007, at 7:51 PM, Bora Akyol wrote:
>
>> Please see my comments inline:
>>
>>> -----Original Message-----
>>> From: Gian Constantine [mailto:[EMAIL PROTECTED]
>>> Sent: Monday, January 08, 2007 4:27 PM
>>> To: Bora Akyol
>>> Cc: nanog@merit.edu
>>> Subject: Re: Network end users to pull down 2 gigabytes a
>>> day, continuously?
>>
>> <snip>
>>
>>> I would also argue storage and distribution costs are not
>>> asymptotically zero with scale. Well-designed SANs are not
>>> cheap. Well-designed distribution systems are not cheap.
>>> While price does decrease when scaled upwards, the cost of
>>> such an operation remains hefty, and increases with additions
>>> to the offered content library and a swelling of demand for
>>> this content. I believe the graph becomes neither asymptotic
>>> nor anywhere near zero.
>>
>> To the end user, there is no cost to downloading videos while they
>> are sleeping. I would argue that, other than sports (and some news)
>> events, there is pretty much no content that needs to be real-time.
>> What the downloading (possibly 24x7) does is stress the ISP network
>> to its max, since the assumptions of statistical multiplexing go
>> out the window. Think of a Tivo that downloads content off the
>> Internet 24x7.
>>
>> The user is still paying only what they pay each month, and this is
>> network neutrality 2.0 all over again.
>>
>>> You are correct on the long tail nature of music. But music
>>> is not consumed in a similar manner as TV and movies.
>>> Television and movies involve a little more commitment and
>>> attention. Music is more for the moment and the mood. There
>>> is an immediacy with music consumption. Movies and television
>>> require a slight degree more patience from the consumer. The
>>> freshness (debatable :-) ) of new release movies and TV can
>>> often command the required patience from the consumer. Older
>>> content rarely has the same pull.
>>
>> I would argue against your distinction between visual and auditory
>> content. There is a lot of content out there that a lot of people
>> watch even though it is 20-40+ years old. Think Brady Bunch,
>> Bonanza, or archived games from the NFL, MLB, etc. What about the
>> Smurfs (for those of us with kids)?
>>
>> This is only the beginning.
>>
>> If I can get a 500GB box and download MP4 content, that's a lot of
>> essentially free storage.
>>
>> Coming

Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread Gian Constantine
My contention is simple. The content providers will not allow P2P  
video as a legal commercial service anytime in the near future.  
Furthermore, most ISPs are going to side with the content providers  
on this one. Therefore, discussing it at this point in time is purely  
academic, or more so, diversionary.


Personally, I am not one for throttling high use subscribers. Outside  
of the fine print, which no one reads, they were sold a service of  
Xkbps down and Ykbps up. I could not care less how, when, or how  
often they use it. If you paid for it, burn it up.


I have questions as to whether or not P2P video is really a smart
distribution method for a service provider who controls the access
medium. Outside of being a service provider, I think the economic
model is weak when there can be little expectation of a large-scale
take rate.


Ultimately, my answer is: we're not there yet. The infrastructure  
isn't there. The content providers aren't there. The market isn't  
there. The product needs a motivator. This discussion has been  
putting the cart before the horse.


A lot of big-picture pieces are completely overlooked. We fail to
question whether or not P2P sharing is a good method of delivering
the product. There are a lot of factors which play into this.
Unfortunately, more interest has been paid to the details of this
delivery method than to whether or not the method is even
worthwhile.


From a big-picture standpoint, I would say P2P distribution is a
non-starter: too many reluctant parties to appease. From a detail
standpoint, I would say P2P distribution faces too many hurdles in
existing network infrastructure to be justified. Simply reference the
discussion of upstream bandwidth caps and you will have a wonderful
example of those hurdles.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.


On Jan 8, 2007, at 9:49 PM, Thomas Leavitt wrote:

> So, kind of back to the original question: what is going to be the
> reaction of your average service provider to the presence of an
> increasing number of people sucking down massive amounts of video
> and spitting it back out again... nothing? Throttling all traffic
> of a certain type? Shutting down customers who exceed certain
> thresholds? Or just throttling their traffic? Massive upgrades of
> internal network hardware?
>
> Is it your contention that there's no economic model, given the
> architecture of current networks, which would generate enough
> revenue to offset the cost of traffic generated by P2P video?
>
> Thomas
>
> Gian Constantine wrote:
>> There may have been a disconnect on my part, or at least a failure
>> to disclose my position. I am looking at things from a provider
>> standpoint, whether as an ISP or a strict video service provider.
>>
>> I agree with you. From a consumer standpoint, a trickle or off-peak
>> download model is the ideal low-impact solution to content
>> delivery. And absolutely, a 500GB drive would almost be overkill
>> on space for disposable content encoded in H.264. Excellent SD
>> (480i) content can be achieved at ~1200 to 1500kbps, resulting in
>> about a 1GB file for a 90-minute title. HD is almost out of the
>> question for internet download: even good 720p at ~5500kbps
>> results in roughly a 3.7GB file for a 90-minute title.
>>
>> Service providers wishing to provide this service to their
>> customers may see some success where they control the access
>> medium (copper loop, coax, FTTH). Offering such a service to
>> customers outside of this scope would prove very expensive, and
>> likely, would never see a return on the investment without
>> extensive peering arrangements. Even then, distribution rights
>> would be very difficult to attain without very deep pockets and
>> crippling revenue sharing. The studios really dislike the idea of
>> transmission outside of a closed network. Don't forget. Even the
>> titles you mentioned are still owned by very large companies
>> interested in squeezing every possible dime from their assets.
>> They would not be cheap to acquire.
>>
>> Further, torrent-like distribution is a long, long way from
>> sign-off by the content providers. They see torrents as the number
>> one tool of content piracy. This is a major reason I see the
>> discussion of tripping upstream usage limits through content
>> distribution as moot.
>>
>> I am with you on the vision of massive content libraries at the
>> fingertips of all, but I see many roadblocks in the way. And,
>> almost none of them are technical in nature.
>>
>> Gian Anthony Constantine
>> Senior Network Design Engineer
>> Earthlink, Inc.
>> Office: 404-748-6207
>> Cell: 404-808-4651
>> Internal Ext: x22007
>> [EMAIL PROTECTED]
>>
>> On Jan 8, 2007, at 7:51 PM, Bora Akyol wrote:
>>
>>> Please see my comments inline:
>>>
>>>> -----Original Message-----
>>>> From: Gian Constantine [mailto:[EMAIL PROTECTED]
>>>> Sent: Monday, January 08, 2007 4:27 PM
>>>> To: Bora Akyol
>>>> Cc: nanog@merit.edu
>>>> Subject: Re:

RE: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread Michael . Dillon

> Bring that box to the living room in an attractive package and
> the stats will be very different.

This kind of box is very popular in England. It
is called a digital TV receiver, and it receives
MPEG-2 streams broadcast freely over the airwaves.
Some people, myself included, have a receiver
with a hard disk that allows pausing live TV and
scheduling recordings from the electronic program
guide which is part of the broadcast stream.

Given that the broadcast model for streaming content
is so successful, why would you want to use the
Internet for it? What is the benefit?

--Michael Dillon