Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-21 Thread Alexander Harrowell

Said Sprunk:

Caching per se doesn't apply to P2P networks, since they already do that


as part of their normal operation.  The key is getting users to contact
peers who are topologically closer, limiting the bits * distance
product.  It's ridiculous that I often get better transfer rates with
peers in Europe than with ones a few miles away.  The key to making
things more efficient is not to limit the bandwidth to/from the customer
premise, but limit it leaving the POP and between ISPs.  If I can
transfer at 100kB/s from my neighbors but only 10kB/s from another
continent, my opportunistic client will naturally do what my ISP wants
as a side effect.

The second step, after you've relocated the rate limiting points, is for
ISPs to add their own peers in each POP.  Edge devices would passively
detect when more than N customers have accessed the same torrent, and
they'd signal the ISP's peer to add them to its list.  That peer would
then download the content, and those N customers would get it from the
ISP's peer. Creative use of rate limits and access control could make it
even more efficient, but they're not strictly necessary.



Good thinking. Where do I sign? Regarding your first point, it's really
surprising that existing P2P applications don't include topology awareness.
After all, the underlying TCP already has mechanisms to perceive the
relative nearness of a network entity - counting hops or round-trip latency.
Imagine a BT-like client that searches for available torrents, and records
the round-trip time to each host it contacts. These it places in a lookup
table and picks the fastest responders to initiate the data transfer. Those
are likely to be the closest, if not in distance then topologically, and the
ones with the most bandwidth. Further, imagine that it caches the search -
so when you next seek a file, it checks for it first on the hosts nearest to
it in its routing table, stepping down progressively if it's not there.
It's a form of local-pref.
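For illustration, a minimal sketch in C of the lookup table such a client
might keep (hypothetical code, not from any real client; it assumes the
RTTs were already measured while contacting each host):

#include <stdio.h>
#include <stdlib.h>

struct peer {
    char addr[64];      /* peer address, as text for simplicity */
    double rtt_ms;      /* measured round-trip time */
};

/* qsort comparator: lowest RTT (likely nearest peer) first */
static int by_rtt(const void *a, const void *b)
{
    const struct peer *pa = a, *pb = b;
    return (pa->rtt_ms > pb->rtt_ms) - (pa->rtt_ms < pb->rtt_ms);
}

int main(void)
{
    /* hypothetical peers with already-measured RTTs */
    struct peer peers[] = {
        { "eu-seed.example",    140.0 },  /* another continent */
        { "neighbour.example",    8.0 },  /* a few miles away  */
        { "same-pop.example",    12.0 },  /* same ISP POP      */
    };
    int n = sizeof(peers) / sizeof(peers[0]);

    /* prefer the fastest responders when opening data transfers */
    qsort(peers, n, sizeof(peers[0]), by_rtt);

    for (int i = 0; i < n; i++)
        printf("%d. %s (%.1f ms)\n", i + 1, peers[i].addr, peers[i].rtt_ms);
    return 0;
}

Sorting ascending by RTT gives exactly the "fastest responders first"
order described above; a real client would keep refreshing the table as
transfers proceed.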

The third step is for content producers to directly add their torrents
to the ISP peers before releasing the torrent directly to the public.
This gets official content pre-positioned for efficient distribution,
making it perform better (from a user's perspective) than pirated
content.

The two great things about this are (a) it doesn't require _any_ changes
to existing clients or protocols since it exploits existing behavior,
and (b) it doesn't need to cover 100% of the content or be 100%
reliable, since if a local peer isn't found with the torrent, the
clients will fall back to their existing behavior (albeit with lower
performance).



Importantly, this option makes it perform better without making everyone
else's perform worse - a big difference from a lot of proposed QoS schemes.
This non-evilness is much to be preferred. Further, it also makes use of the
Zipf behaviour discussed upthread - if 20 per cent of the content and 20 per
cent of the users eat 80 per cent of the bandwidth, forward-deploying that
20 per cent of the content will save 80 per cent of the inter-provider
bandwidth (which is what we care about, right, 'cos we're paying for it).



One thing that _does_ potentially break existing clients is forcing all
of the tracker (including DHT) requests through an ISP server.  The ISP
could then collect torrent popularity data in one place, but more
importantly it could (a) forward the request upstream, replacing the IP
with its own peer, and (b) only inform clients of other peers (including
the ISP one) using the same intercept point.  This looks a lot more like
a traditional transparent cache, with the attendant reliability and
capacity concerns, but I wouldn't be surprised if this were the first
mechanism to make it to market.



It's a nice idea to collect popularity data at the ISP level, because the
decision on what to load into the local torrent servers could be automated.
Once torrent X reaches a certain trigger level of popularity, the local
server grabs it and begins serving, and the local-pref function on the
clients finds it. Meanwhile, we drink coffee. However, it's a potential DOS
magnet - after all, P2P is really a botnet with a badge. And the point of a
topology-aware P2P client is that it seeks the nearest host, so if you
constrain it to the ISP local server only, you're losing part of the point
of P2P for no great saving in peering/transit.

However, it's going to be competing with a deeply-entrenched pirate
culture, so the key will be attracting new users who aren't technical
enough to use the existing tools, via an easy-to-use interface.  Not
surprisingly, the same folks are working on deals to integrate BT (the
protocol) into STBs, routers, etc. so that users won't even know what's
going on beneath the surface -- they'll just see a TiVo-like interface
and pay a monthly fee like with cable.



As long as they don't interfere with the user's right to choose someone
else's content, fine.

Alex


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-21 Thread Petri Helenius


Gian Constantine wrote:


I agree with you. From a consumer standpoint, a trickle or off-peak 
download model is the ideal low-impact solution to content delivery. 
And absolutely, a 500GB drive would almost be overkill on space for 
disposable content encoded in H.264. Excellent SD (480i) content can 
be achieved at ~1200 to 1500kbps, resulting in about a 1GB file for a 
90 minute title. HD is almost out of the question for internet 
download, given good 720p at ~5500kbps, resulting in a 30GB file for a 
90 minute title.


Kilobits, not bytes. So it's 3.7GB for 90 minutes of 720p at 5.5Mbps
(5.5 Mbit/s x 5,400 s / 8 bits per byte is roughly 3.7 GB). Regularly
transferred over the internet.
Popular content in the 2-4GB size category has tens of thousands, and in
some cases hundreds of thousands, of downloads from a single tracker.
Saying it's out of the question does not make it go away. But denial is
usually the first phase anyway.


Pete




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-21 Thread Joe Abley



On 21-Jan-2007, at 07:14, Alexander Harrowell wrote:

Regarding your first point, it's really surprising that existing  
P2P applications don't include topology awareness. After all, the  
underlying TCP already has mechanisms to perceive the relative  
nearness of a network entity - counting hops or round-trip latency.  
Imagine a BT-like client that searches for available torrents, and  
records the round-trip time to each host it contacts. These it  
places in a lookup table and picks the fastest responders to  
initiate the data transfer. Those are likely to be the closest, if  
not in distance then topologically, and the ones with the most  
bandwidth. Further, imagine that it caches the search -  so when  
you next seek a file, it checks for it first on the hosts nearest  
to it in its routing table, stepping down progressively if it's  
not there. It's a form of local-pref.


Remember though that the dynamics of the system need to assume that  
individual clients will be selfish, and even though it might be in  
the interests of the network as a whole to choose local peers, if you  
can get faster *throughput* (not round-trip response) from a remote  
peer, it's a necessary assumption that the peer will do so.


Protocols need to be designed such that a client is rewarded in  
faster downloads for uploading in a fashion that best benefits the  
swarm.



The third step is for content producers to directly add their torrents
to the ISP peers before releasing the torrent directly to the public.
This gets official content pre-positioned for efficient distribution,
making it perform better (from a user's perspective) than pirated
content.


If there was a big fast server in every ISP with a monstrous pile of  
disk which retrieved torrents automatically from a selection of  
popular RSS feeds, which kept seeding torrents for as long as there  
was interest and/or disk, and which had some rate shaping installed  
on the host such that traffic that wasn't on-net (e.g. to/from  
customers) or free (e.g. to/from peers) was rate-crippled, how far  
would that go to emulating this behaviour with existing live  
torrents? Speaking from a technical perspective only, and ignoring  
the legal minefield.


If anybody has tried this, I'd be interested to hear whether on-net  
clients actually take advantage of the local monster seed, or whether  
they persist in pulling data from elsewhere.



Joe



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-21 Thread D.H. van der Woude

There are other developments as well...

Simple Minds and Motorpsycho live. Mashed Up.

Still need to get a better grip on what the new world of Mashup business
models (http://www.capgemini.com/ctoblog/2006/11/mashup_corporations_the_shape.php)
really is leading to? Have a look at this new mashup service of Fabchannel
(http://www.fabchannel.com/): until now 'just' an award-winning website which
gave its members access to videos of rock concerts in Amsterdam's famous
Paradiso (http://www.paradiso.nl/) concert hall. Not any more. Today
Fabchannel launched a new, unique service
(http://fabchannel.blogspot.com/2007/01/fabchannel-releases-unique-embedded.html)
which enables music fans to create their own, custom made concert videos and
then share them with others through their blogs, community profiles,
websites or any other application.

So suppose you have this weird music taste, which sort of urges you to
create an ideal concert featuring the Simple Minds, Motorpsycho, The Fun
Loving Criminals, Ojos de Brujo and Bauer & the Metropole Orchestra. *Just
suppose it's true*. The only thing you need to do is click this concert
together at Fabchannel's site (choosing from the many hundreds of videos
available), customize it with your own tags, image and description, and then
have Fabchannel automatically create the few lines of html code that you
need to embed this tailor-made concert in whatever web application you want.

As Fabchannel put it in their announcement, this makes live concerts
available to fans all over the world. Not centralised in one place, but
where the fans gather online. And this is precisely the major concept
behind the Mashup Corporation (http://www.mashupcorporations.com/): supply
the outside world with simple, embeddable services; support and facilitate
the community that starts to use them; and watch growth and innovation take
place in many unexpected ways.

Fabchannel expects to attract many more fans than they currently do. Not by
having more hits at their website, but rather through the potentially
thousands and thousands of blogs, myspace pages, websites, forums and
desktop widgets that all could reach their own niche group of music fans,
mashing up the Fabplayer service with many other services that the
Fabchannel crew – no matter how creative – would have never thought of.

Maximise your growth, attract fewer people to your site. Sounds like a
paradox. But not in a Mashup world.

By all means view my customised concert, underneath. I'm particularly fond
of the Barcelonan band Ojos de Brujo, with their very special mix of classic
flamenco, hip hop and funk. Mashup music indeed. In all respects.
http://www.capgemini.com/ctoblog/2007/01/simple_minds_and_motorpyscho_l.php

On 1/21/07, Joe Abley [EMAIL PROTECTED] wrote:




On 21-Jan-2007, at 07:14, Alexander Harrowell wrote:

 Regarding your first point, it's really surprising that existing
 P2P applications don't include topology awareness. After all, the
 underlying TCP already has mechanisms to perceive the relative
 nearness of a network entity - counting hops or round-trip latency.
 Imagine a BT-like client that searches for available torrents, and
 records the round-trip time to each host it contacts. These it
 places in a lookup table and picks the fastest responders to
 initiate the data transfer. Those are likely to be the closest, if
 not in distance then topologically, and the ones with the most
 bandwidth. Further, imagine that it caches the search -  so when
 you next seek a file, it checks for it first on the hosts nearest
 to it in its routing table, stepping down progressively if it's
 not there. It's a form of local-pref.

Remember though that the dynamics of the system need to assume that
individual clients will be selfish, and even though it might be in
the interests of the network as a whole to choose local peers, if you
can get faster *throughput* (not round-trip response) from a remote
peer, it's a necessary assumption that the peer will do so.

Protocols need to be designed such that a client is rewarded in
faster downloads for uploading in a fashion that best benefits the
swarm.

 The third step is for content producers to directly add their torrents
 to the ISP peers before releasing the torrent directly to the public.
 This gets official content pre-positioned for efficient
 distribution,
 making it perform better (from a user's perspective) than pirated
 content.

If there was a big fast server in every ISP with a monstrous pile of
disk which retrieved torrents automatically from a selection of
popular RSS feeds, which kept seeding torrents for as long as there
was interest and/or disk, and which had some rate shaping installed
on the host such that traffic that wasn't on-net (e.g. to/from
customers) or free (e.g. to/from peers) was rate-crippled, how far
would that go to emulating this behaviour with existing live
torrents? Speaking from a technical perspective only, and ignoring
the legal minefield.

If anybody has tried this, I'd be interested to hear whether on-net
clients actually take advantage of the local monster seed, or whether
they persist in pulling data from elsewhere.

Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-21 Thread Petri Helenius


Joe Abley wrote:


If anybody has tried this, I'd be interested to hear whether on-net 
clients actually take advantage of the local monster seed, or whether 
they persist in pulling data from elsewhere.


The local seed would serve the bulk of the data, because as soon as a piece
is served from it the client issues a new request; if the latency and
bandwidth are there, as is the case for ADSL/cable clients, usually
80% of a file is served locally.

I don't think additional optimization is done in the client, nor is it needed.

Pete



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-21 Thread Stephen Sprunk


[ Note: please do not send MIME/HTML messages to mailing lists ]

Thus spake Alexander Harrowell
Good thinking. Where do I sign? Regarding your first point, it's really
surprising that existing P2P applications don't include topology
awareness. After all, the underlying TCP already has mechanisms
to perceive the relative nearness of a network entity - counting hops
or round-trip latency. Imagine a BT-like client that searches for
available torrents, and records the round-trip time to each host it
contacts. These it places in a lookup table and picks the fastest
responders to initiate the data transfer. Those are likely to be the
closest, if not in distance then topologically, and the ones with the
most bandwidth.


The BT algorithm favors peers with the best performance, not peers that 
are close.  You can rail against this all you want, but expecting users 
to do anything other than local optimization is a losing proposition.


The key is tuning the network so that local optimization coincides with 
global optimization.  As I said, I often get 10x the throughput with 
peers in Europe vs. peers in my own city.  You don't like that?  Well, 
rate-limit BT traffic at the ISP border and _don't_ rate-limit within 
the ISP.  (s/ISP/POP/ if desired)  Make the cheap bits fast and the 
expensive bits slow, and clients will automatically select the cheapest 
path.
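As a toy illustration of that policy in C (the on-net prefix and the rates
are made-up examples, not anyone's real configuration):

#include <stdio.h>
#include <stdint.h>

/* Hypothetical on-net prefix: 10.0.0.0/8 stands in for the ISP's space */
#define ON_NET_PREFIX 0x0a000000u
#define ON_NET_MASK   0xff000000u

#define ON_NET_KBPS  100000u  /* cheap bits: effectively unthrottled */
#define OFF_NET_KBPS    256u  /* expensive bits: crippled at the border */

/* Pick the rate cap for a flow based on where the far end lives. */
unsigned rate_cap_kbps(uint32_t dst_ip)
{
    if ((dst_ip & ON_NET_MASK) == ON_NET_PREFIX)
        return ON_NET_KBPS;
    return OFF_NET_KBPS;
}

int main(void)
{
    uint32_t neighbour = 0x0a010203u;  /* 10.1.2.3: on-net   */
    uint32_t faraway   = 0xc0000201u;  /* 192.0.2.1: off-net */

    printf("neighbour cap: %u kbit/s\n", rate_cap_kbps(neighbour));
    printf("far peer cap:  %u kbit/s\n", rate_cap_kbps(faraway));
    return 0;
}

With a policy like this in place, opportunistic clients converge on local
peers with no protocol changes at all.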



Further, imagine that it caches the search - so when you next seek
a file, it checks for it first on the hosts nearest to it in its routing
table, stepping down progressively if it's not there. It's a form of
local-pref.


Experience shows that it's not necessary, though if it has a non-trivial 
positive effect I wouldn't be surprised if it shows up someday.



It's a nice idea to collect popularity data at the ISP level, because
the decision on what to load into the local torrent servers could be
automated.


Note that collecting popularity data could be done at the edges without 
forcing all tracker requests through a transparent proxy.


Once torrent X reaches a certain trigger level of popularity, the local
server grabs it and begins serving, and the local-pref function on the
clients finds it. Meanwhile, we drink coffee.  However, it's a potential
DOS magnet - after all, P2P is really a botnet with a badge.


I don't see how.  If you detect that N customers are downloading a 
torrent, then having the ISP's peer download that torrent and serve it 
to the customers means you consume 1/N upstream bandwidth.  That's an 
anti-DOS :)



And the point of a topology-aware P2P client is that it seeks the
nearest host, so if you constrain it to the ISP local server only, you're
losing part of the point of P2P for no great saving in peering/transit.


That's why I don't like the idea of transparent proxies for P2P; you can 
get 90% of the effect with 10% of the evilness by setting up sane 
rate-limits.


As long as they don't interfere with the user's right to choose someone
else's content, fine.


If you're getting it from an STB, well, there may not be a way for users 
to add 3rd party torrents; how many users will be able to figure out how 
to add the torrent URLs (or know where to find said URLs) even if there 
is an option?  Remember, we're talking about Joe Sixpack here, not 
techies.


You would, however, be able to pick whatever STB you wanted (unless ISPs 
deliberately blocked competitors' services).


S

Stephen Sprunk God does not play dice.  --Albert Einstein
CCIE #3723 God is an inveterate gambler, and He throws the
K5SSS          dice at every possible opportunity. --Stephen Hawking 



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-21 Thread Alexander Harrowell

Sprunk:


 It's a nice idea to collect popularity data at the ISP level, because
 the decision on what to load into the local torrent servers could be
 automated.

Note that collecting popularity data could be done at the edges without
forcing all tracker requests through a transparent proxy.



Yes. This is my point. It's a good thing to do, but centralising it is an
ungood thing to do, because...


Once torrent X reaches a certain trigger level of popularity, the local
server grabs it and begins serving, and the local-pref function on the
clients finds it. Meanwhile, we drink coffee.  However, it's a potential
DOS magnet - after all, P2P is really a botnet with a badge.

I don't see how.  If you detect that N customers are downloading a
torrent, then having the ISP's peer download that torrent and serve it
to the customers means you consume 1/N upstream bandwidth.  That's an
anti-DOS :)



All true. My point is that forcing all tracker requests through a proxy
makes that machine an obvious DDOS target. It's got to have an open
interface to all hosts on your network on one side, and to $world on the
other, and if it goes down, then everyone on your network loses service. And
you're expecting traffic distributed over a large number of IP addresses
because it's a P2P application, so distinguishing normal traffic from a
botnet attack will be hard.


And the point of a topology-aware P2P client is that it seeks the
nearest host, so if you constrain it to the ISP local server only, you're
losing part of the point of P2P for no great saving in peering/transit.

That's why I don't like the idea of transparent proxies for P2P; you can
get 90% of the effect with 10% of the evilness by setting up sane
rate-limits.



OK.


As long as they don't interfere with the user's right to choose someone
else's content, fine.

If you're getting it from an STB, well, there may not be a way for users
to add 3rd party torrents; how many users will be able to figure out how
to add the torrent URLs (or know where to find said URLs) even if there
is an option?  Remember, we're talking about Joe Sixpack here, not
techies.

You would, however, be able to pick whatever STB you wanted (unless ISPs
deliberately blocked competitors' services).



Please. Joe has a right to know these things. How long before Joe finds out
anyway?


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-21 Thread Stephen Sprunk


Thus spake Joe Abley [EMAIL PROTECTED]
If there was a big fast server in every ISP with a monstrous pile of 
disk which retrieved torrents automatically from a selection of 
popular RSS feeds, which kept seeding torrents for as long as there 
was interest and/or disk, and which had some rate shaping installed 
on the host such that traffic that wasn't on-net (e.g. to/from 
customers) or free (e.g. to/from peers) was rate-crippled, how far 
would that go to emulating this behaviour with existing live 
torrents?


Every torrent indexing site I'm aware of has RSS feeds for newly-added 
torrents, categorized many different ways.  Any ISP that wanted to set 
up such a service could do so _today_ with _existing_ tools.  All that's 
missing is the budget and a go-ahead from the lawyers.



Speaking from a technical perspective only, and ignoring the legal
minefield.


Aside from that, Mrs. Lincoln, how was the play?

If anybody has tried this, I'd be interested to hear whether on-net 
clients actually take advantage of the local monster seed, or whether 
they persist in pulling data from elsewhere.


Clients pull data from everywhere that'll send it to them.  The 
important thing is what percentage of the bits come from where.  If I 
can reach local peers at 90kB/s and remote peers at 10kB/s, then local 
peers will end up accounting for 90% of the bits I download. 
Unfortunately, due to asymmetric connections, rate limiting, etc. it 
frequently turns out that remote peers perform better than local ones in 
today's consumer networks.


Uploading doesn't work exactly the same way, but it's similar.  During 
the leeching phase, clients will upload to a handful of peers that they 
get the best download rates from.  However, the optimistic unchoke 
algorithm will lead to some bits heading off to poorer-performing peers. 
During the seeding phase, clients will upload to a handful of peers that 
they get the best _upload_ rates to, plus a few bits off to optimistic 
unchoke peers.
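A toy sketch of that selection logic in C (an illustration of the idea,
not Bram's actual algorithm; the measured rates are faked):

#include <stdio.h>
#include <stdlib.h>

#define NPEERS       8
#define UPLOAD_SLOTS 4

struct peer {
    int id;
    double rate;     /* download rate observed from this peer (leeching) */
    int unchoked;
};

static int by_rate_desc(const void *a, const void *b)
{
    const struct peer *pa = a, *pb = b;
    return (pb->rate > pa->rate) - (pb->rate < pa->rate);
}

int main(void)
{
    struct peer peers[NPEERS];
    for (int i = 0; i < NPEERS; i++) {
        peers[i].id = i;
        peers[i].rate = rand() % 1000 / 10.0;  /* fake measured rates */
        peers[i].unchoked = 0;
    }

    /* Reciprocate: unchoke the peers we download fastest from */
    qsort(peers, NPEERS, sizeof(peers[0]), by_rate_desc);
    for (int i = 0; i < UPLOAD_SLOTS; i++)
        peers[i].unchoked = 1;

    /* Optimistic unchoke: give one choked peer a chance to prove itself */
    int opt = UPLOAD_SLOTS + rand() % (NPEERS - UPLOAD_SLOTS);
    peers[opt].unchoked = 1;

    for (int i = 0; i < NPEERS; i++)
        printf("peer %d rate %5.1f kB/s %s\n", peers[i].id, peers[i].rate,
               peers[i].unchoked ? "unchoked" : "choked");
    return 0;
}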


Do I have hard data?  No.  Is there any reason to think real-world 
behavior doesn't match theory?  No.  I frequently stare at the Peer 
stats window on my BT client and it's doing exactly what Bram's original 
paper says it should be doing.  That I get better transfer rates with 
people in Malaysia and Poland than with my next-door neighbor is the 
ISPs' fault, not Bram's.


S

Stephen Sprunk God does not play dice.  --Albert Einstein
CCIE #3723 God is an inveterate gambler, and He throws the
K5SSS          dice at every possible opportunity. --Stephen Hawking 



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-21 Thread Joe Abley



On 21-Jan-2007, at 14:07, Stephen Sprunk wrote:

Every torrent indexing site I'm aware of has RSS feeds for newly- 
added torrents, categorized many different ways.  Any ISP that  
wanted to set up such a service could do so _today_ with _existing_  
tools.  All that's missing is the budget and a go-ahead from the  
lawyers.


Yes, I know.

If anybody has tried this, I'd be interested to hear whether on- 
net clients actually take advantage of the local monster seed, or  
whether they persist in pulling data from elsewhere.


[...] Do I have hard data?  No. [...]


So, has anybody actually tried this?

Speculating about how clients might behave is easy, but real  
experience is more interesting.



Joe



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-21 Thread Steve Gibbard


On Sun, 21 Jan 2007, Joe Abley wrote:

Remember though that the dynamics of the system need to assume that 
individual clients will be selfish, and even though it might be in the 
interests of the network as a whole to choose local peers, if you can get 
faster *throughput* (not round-trip response) from a remote peer, it's a 
necessary assumption that the peer will do so.


It seems like if there's an issue here it's that different parties have 
different self-interests, and those whose interests aren't being served 
aren't passing on the costs to the decision makers.  The users' 
performance interests are served by getting the fastest downloads 
possible.  The ISP's financial interests would be served by their flat 
rate customers getting their data from somewhere close by.  If it becomes 
enough of a problem that the ISPs are motivated to deal with it, one 
approach would be to get the customers' financial interests better 
aligned with their own, with differentiated billing for local and long 
distance traffic.


Perth, on the West Coast of Australia, claims to be the world's most 
isolated capital city (for some definition of capital).  Next closest is 
probably Adelaide, at 1,300 miles.  Jakarta and Sydney are both 2,000 miles 
away.  Getting stuff, including data, in and out is expensive.  Like 
Seattle, Perth has many of its ISPs in the same downtown skyscraper, and 
a very active exchange point in the building.  It is much cheaper for ISPs 
to hand off local traffic to each other than to hand off long distance 
traffic to their far away transit providers.  Like ISPs in a lot of 
similar places, the ISPs in Perth charge their customers different rates 
for cheap local bandwidth than for expensive long distance bandwidth.


When I was in Perth a couple of years ago, I asked my usual questions 
about what effect this billing arrangement was having on user behavior. 
I was told about a Perth-only file sharing network.  Using the same file 
sharing networks as the rest of the world was expensive, as they would end 
up hauling lots of data over the expensive long distance links and users 
didn't want to pay for that.  Instead, they'd put together their own, 
which only allowed local users and thus guaranteed that uploads and 
downloads would happen at cheap local rates.


Googling for more information just now, what I found were lots of stories 
about police raids, so I'm not sure if it's still operational.  Legal 
problems seem to be an issue for file sharing networks regardless of 
geographic focus, so that's probably not relevant to this particular 
point.


In the US and Western Europe, there's still enough fiber between cities 
that high volumes of long distance traffic don't seem to be causing 
issues, and pricing is becoming less distance sensitive.  The parts of the 
world with shortages of external connectivity pay to get to us, so we 
don't see those costs either.  If that changes, I suspect we'll see it 
reflected in the pricing models and user self-interests will change.  The 
software that users will be using will change accordingly, as it did in 
Perth.


-Steve


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-21 Thread Alexander Harrowell

Gibbard:

It seems like if there's an issue here it's that different parties
have different self-interests, and those whose interests aren't being served
aren't passing on the costs to the decision makers.  The users'
performance interests are served by getting the fastest downloads
possible.  The ISP's financial interests would be served by their flat
rate customers getting their data from somewhere close by.  If it becomes
enough of a problem that the ISPs are motivated to deal with it, one
approach would be to get the customers' financial interests better
aligned with their own, with differentiated billing for local and long
distance traffic.



That could be seen as a confiscation of a major part of the value customers
derive from ISPs.

Perth, on the West Coast of Australia, claims to be the world's most
isolated capital city (for some definition of capital).  Next closest is
probably Adelaide, at 1,300 miles.  Jakarta and Sydney are both 2,000 miles
away.  Getting stuff, including data, in and out is expensive.  Like
Seattle, Perth has many of its ISPs in the same downtown skyscraper, and
a very active exchange point in the building.  It is much cheaper for ISPs
to hand off local traffic to each other than to hand off long distance
traffic to their far away transit providers.  Like ISPs in a lot of
similar places, the ISPs in Perth charge their customers different rates
for cheap local bandwidth than for expensive long distance bandwidth.

When I was in Perth a couple of years ago, I asked my usual questions
about what effect this billing arrangement was having on user behavior.
I was told about a Perth-only file sharing network.  Using the same file
sharing networks as the rest of the world was expensive, as they would end
up hauling lots of data over the expensive long distance links and users
didn't want to pay for that.  Instead, they'd put together their own,
which only allowed local users and thus guaranteed that uploads and
downloads would happen at cheap local rates.

Googling for more information just now, what I found were lots of stories
about police raids, so I'm not sure if it's still operational.



Brendan Behan: "There is no situation that cannot be made worse by the
presence of a policeman."

-Steve




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-21 Thread Perry Lorier




Good thinking. Where do I sign? Regarding your first point, it's really
surprising that existing P2P applications don't include topology awareness.
After all, the underlying TCP already has mechanisms to perceive the
relative nearness of a network entity - counting hops or round-trip latency.
Imagine a BT-like client that searches for available torrents, and records
the round-trip time to each host it contacts. These it places in a lookup
table and picks the fastest responders to initiate the data transfer. Those
are likely to be the closest, if not in distance then topologically, and the
ones with the most bandwidth. Further, imagine that it caches the search -
so when you next seek a file, it checks for it first on the hosts nearest to
it in its routing table, stepping down progressively if it's not there.
It's a form of local-pref.


When I investigated bit torrent clients a couple of years ago, the 
tracker would only send you a small subset of its peers at random, so 
as a client you often weren't told about the peer that was right beside 
you.  Trackers could in theory send you peers that were close to you 
(e.g. send you anyone that's in the same /24, a few from the same /16, a 
few more from the same /8, and a handful from other places).  But the 
tracker has no idea which areas you get good speeds to, and generally 
wants to be as simple as possible.
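A sketch of that /24-/16-/8 bucketing in C (hypothetical; as noted, real
trackers don't do this):

#include <stdio.h>
#include <stdint.h>

/* Bucket a candidate peer by how much address prefix it shares
   with the requesting client - a crude stand-in for "closeness". */
int closeness_bucket(uint32_t client, uint32_t candidate)
{
    uint32_t diff = client ^ candidate;
    if ((diff & 0xffffff00u) == 0) return 24;  /* same /24 */
    if ((diff & 0xffff0000u) == 0) return 16;  /* same /16 */
    if ((diff & 0xff000000u) == 0) return 8;   /* same /8  */
    return 0;                                  /* elsewhere */
}

int main(void)
{
    uint32_t client = 0xcb007101u;  /* 203.0.113.1 (example address) */
    uint32_t candidates[] = {
        0xcb007163u,  /* 203.0.113.99 - same /24 */
        0xcb00ff01u,  /* 203.0.255.1  - same /16 */
        0xcb636301u,  /* 203.99.99.1  - same /8  */
        0x08080808u,  /* 8.8.8.8      - elsewhere */
    };

    for (int i = 0; i < 4; i++)
        printf("candidate %d: bucket /%d\n", i,
               closeness_bucket(client, candidates[i]));
    return 0;
}

A tracker using something like this could fill most of its reply from the
nearest non-empty buckets and pad the rest from the swarm at large.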


Also, in most unixes you can query the TCP stack for its current 
estimate of the RTT on a TCP connection with:


#include <sys/types.h>
#include <sys/socket.h>
#include <netinet/tcp.h>
#include <stdio.h>

int fd;                          /* an established TCP socket */
struct tcp_info tcpinfo;
socklen_t len = sizeof(tcpinfo);

if (getsockopt(fd, SOL_TCP, TCP_INFO, &tcpinfo, &len) != -1) {
    /* tcpi_rtt is reported in microseconds */
    printf("estimated rtt: %.04f (seconds)\n", tcpinfo.tcpi_rtt / 1000000.0);
}

Due to rate limiting you can often find you'll get very similar 
performance from a reasonably large subset of your peers, so using TCP's 
RTT estimate as a tie-breaker might provide a reasonable cost saving to 
the ISP (although the end user probably won't notice the difference).




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-21 Thread Gian Constantine

Actually, I acknowledged the calculation mistake in a subsequent post.

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.


On Jan 21, 2007, at 11:11 AM, Petri Helenius wrote:


Gian Constantine wrote:


I agree with you. From a consumer standpoint, a trickle or off- 
peak download model is the ideal low-impact solution to content  
delivery. And absolutely, a 500GB drive would almost be overkill  
on space for disposable content encoded in H.264. Excellent SD  
(480i) content can be achieved at ~1200 to 1500kbps, resulting in  
about a 1GB file for a 90 minute title. HD is almost out of the  
question for internet download, given good 720p at ~5500kbps,  
resulting in a 30GB file for a 90 minute title.


Kilobits, not bytes. So it's 3.7GB for 90 minutes of 720p at 5.5Mbps.  
Regularly transferred over the internet.
Popular content in the size category 2-4GB has tens of thousands  
and in some cases hundreds of thousands of downloads from a single  
tracker. Saying it's out of the question does not make it go away.  
But denial is usually the first phase anyway.


Pete






Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-21 Thread Travis H.
On Sun, Jan 21, 2007 at 12:14:56PM +, Alexander Harrowell wrote:
 After all, the underlying TCP already has mechanisms to perceive the
 relative nearness of a network entity - counting hops or round-trip latency.
 Imagine a BT-like client that searches for available torrents, and records
 the round-trip time to each host it contacts. These it places in a lookup
 table and picks the fastest responders to initiate the data transfer.

Better yet, I was reading some introductory papers on machine learning,
and there are a number of relevant learning algorithms.  The one I think
might apply here uses these various network parameters to predict
high-speed downloads, treating each predictor as an oracle and adjusting
its weight to reflect its judgement accuracy.  Such algorithms typically
give performance epsilon-close to the best expert, and can easily learn
which expert is the best over time, even if that changes.
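A toy multiplicative-weights sketch in C (hypothetical; each "expert" here
is one network metric voting on whether a peer will be fast):

#include <stdio.h>

#define NEXPERTS 3
#define ETA      0.5   /* learning rate */

/* Multiplicative-weights update: each expert (e.g. "low RTT",
   "few hops", "same AS") predicts 1 = fast peer, 0 = slow peer.
   Experts that guess wrong lose weight. */
void update(double w[], const int pred[], int outcome)
{
    for (int i = 0; i < NEXPERTS; i++)
        if (pred[i] != outcome)
            w[i] *= (1.0 - ETA);   /* penalize a wrong prediction */
}

int main(void)
{
    double w[NEXPERTS] = { 1.0, 1.0, 1.0 };

    /* Fake history: expert 0 is always right, expert 2 always wrong */
    int preds[4][NEXPERTS] = { {1,1,0}, {0,0,1}, {1,0,0}, {1,1,0} };
    int outcomes[4] = { 1, 0, 1, 1 };

    for (int t = 0; t < 4; t++)
        update(w, preds[t], outcomes[t]);

    for (int i = 0; i < NEXPERTS; i++)
        printf("expert %d weight %.3f\n", i, w[i]);
    return 0;
}

After a few rounds the client's "trust" concentrates on whichever metric
actually predicts fast peers on its own network.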
-- 
``Unthinking respect for authority is the greatest enemy of truth.''
-- Albert Einstein -- URL:http://www.subspacefield.org/~travis/




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-21 Thread Travis H.
On Sun, Jan 21, 2007 at 06:15:52PM +0100, D.H. van der Woude wrote:
 Simple Minds and Motorpyscho live. Mashed Up.
 Still need to get a better grip on what the new world of Mashup business
 models

Are mashups like:
http://www.popmodernism.org/scrambledhackz/

-- 
``Unthinking respect for authority is the greatest enemy of truth.''
-- Albert Einstein -- URL:http://www.subspacefield.org/~travis/




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-20 Thread Stephen Sprunk


Thus spake Dave Israel [EMAIL PROTECTED]

The past solution to repetitive requests for the same content has been
caching, either reactive (webcaching) or proactive (Akamaizing.)  I
think it is the latter we will see; service providers will push
reasonably cheap servers close to the edge where they aren't too
oversubscribed, and stuff their content there.  A cluster of servers
with terabytes of disk at a regional POP will cost a lot less than
upgrading the upstream links.  And even if the SPs do not want to
invest in developing this product platform for themselves, the price
will likely be paid by the content providers who need performance to
keep subscribers.


Caching per se doesn't apply to P2P networks, since they already do that 
as part of their normal operation.  The key is getting users to contact 
peers who are topologically closer, limiting the bits * distance 
product.  It's ridiculous that I often get better transfer rates with 
peers in Europe than with ones a few miles away.  The key to making 
things more efficient is not to limit the bandwidth to/from the customer 
premise, but limit it leaving the POP and between ISPs.  If I can 
transfer at 100kB/s from my neighbors but only 10kB/s from another 
continent, my opportunistic client will naturally do what my ISP wants 
as a side effect.


The second step, after you've relocated the rate limiting points, is for 
ISPs to add their own peers in each POP.  Edge devices would passively 
detect when more than N customers have accessed the same torrent, and 
they'd signal the ISP's peer to add them to its list.  That peer would 
then download the content, and those N customers would get it from the 
ISP's peer.  Creative use of rate limits and access control could make it 
even more efficient, but they're not strictly necessary.
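A minimal sketch of that detection logic in C (hypothetical code; it
assumes the edge device can extract the 20-byte info_hash from each
tracker request it sees, and counts requests as a rough proxy for
distinct customers):

#include <stdio.h>
#include <string.h>

#define MAX_TORRENTS 1024
#define TRIGGER_N    5   /* "more than N customers" threshold (made up) */

struct entry {
    unsigned char info_hash[20];  /* BitTorrent info_hash */
    int count;                    /* tracker requests seen */
    int signalled;                /* already handed to the ISP peer? */
};

static struct entry table[MAX_TORRENTS];
static int entries;

/* Stub: tell the POP's local peer to join this swarm. */
static void signal_isp_peer(const unsigned char *hash)
{
    printf("ISP peer: join swarm %02x%02x...\n", hash[0], hash[1]);
}

/* Called for each tracker request the edge device observes. */
void saw_tracker_request(const unsigned char *hash)
{
    for (int i = 0; i < entries; i++) {
        if (memcmp(table[i].info_hash, hash, 20) == 0) {
            if (++table[i].count > TRIGGER_N && !table[i].signalled) {
                table[i].signalled = 1;
                signal_isp_peer(hash);
            }
            return;
        }
    }
    if (entries < MAX_TORRENTS) {
        memcpy(table[entries].info_hash, hash, 20);
        table[entries].count = 1;
        table[entries].signalled = 0;
        entries++;
    }
}

int main(void)
{
    unsigned char h[20] = { 0xab, 0xcd };  /* fake info_hash */
    for (int i = 0; i < 8; i++)            /* simulate 8 customer requests */
        saw_tracker_request(h);
    return 0;
}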


The third step is for content producers to directly add their torrents 
to the ISP peers before releasing the torrent directly to the public. 
This gets official content pre-positioned for efficient distribution, 
making it perform better (from a user's perspective) than pirated 
content.


The two great things about this are (a) it doesn't require _any_ changes 
to existing clients or protocols since it exploits existing behavior, 
and (b) it doesn't need to cover 100% of the content or be 100% 
reliable, since if a local peer isn't found with the torrent, the 
clients will fall back to their existing behavior (albeit with lower 
performance).


One thing that _does_ potentially break existing clients is forcing all 
of the tracker (including DHT) requests through an ISP server.  The ISP 
could then collect torrent popularity data in one place, but more 
importantly it could (a) forward the request upstream, replacing the IP 
with its own peer, and (b) only inform clients of other peers (including 
the ISP one) using the same intercept point.  This looks a lot more like 
a traditional transparent cache, with the attendant reliability and 
capacity concerns, but I wouldn't be surprised if this were the first 
mechanism to make it to market.



I think the biggest stumbling block isn't technical.  It is a question
of getting enough content to attract viewers, or alternately, getting
enough viewers to attract content.  Plus, you're going to a format
where the ability to fast-forward commercials is a fact, not a risk,
and you'll have to find a way to get advertisers' products in front of
the viewer to move past pay-per-view.  It's all economics and politics
now.


I think BitTorrent Inc's recent move is the wave of the short-term 
future: distribute files freely (and at low cost) via P2P, but 
DRM-protect the files so that people have to acquire a license to open 
the files.  I can see a variety of subscription models that could pay 
for content effectively under that scheme.


However, it's going to be competing with a deeply-entrenched pirate 
culture, so the key will be attracting new users who aren't technical 
enough to use the existing tools, via an easy-to-use interface.  Not 
surprisingly, the same folks are working on deals to integrate BT (the 
protocol) into STBs, routers, etc. so that users won't even know what's 
going on beneath the surface -- they'll just see a TiVo-like interface 
and pay a monthly fee like with cable.


S

Stephen Sprunk God does not play dice.  --Albert Einstein
CCIE #3723 God is an inveterate gambler, and He throws the
K5SSS          dice at every possible opportunity. --Stephen Hawking 



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-19 Thread Travis H.
On Sat, Jan 13, 2007 at 06:11:32PM -0800, Roland Dobbins wrote:
 This is a very important point - perceived disintermediation,  
 perceived unbundling, ad reduction/elimination, and timeshifting are  
 the main reasons that DVRs are so popular

I am an unusual case, not having much time or interest in passive
entertainment, but I have moved to a MythTV box for my entertainment
center.  I don't have cable TV and my broadcast quality is such that
I don't bother with it.  I can find sufficient things on the net to
occupy those idle times, and can watch them on my limited schedule
and terms.  The BBC in particular has some interesting documentaries,
and I point you to a doubly relevant video series below.

Some others have mentioned that a pay system that was significantly
easier to use than the infringing technologies would turn the tide in
illicit copying.

Those interested in the direction things are going should read up on
Peter Gutmann's paper on the costs of Vista Content Protection.  It
is unfortunate the content owners are more interested in making illicit
copying hard than in making legal purchase and use of the content easy.

I don't intend to pay for systems that I don't control, don't intend
to store my data in formats I don't have documentation for, and don't
anticipate paying for DRM-encoded files ever, mostly because I'd have
to pay for a crippled system which reminds me of buying a car with the
hood welded shut in order to have the privilege of renting content.
Usually in such situations the industry is willing to engage in some
loss leaders; I'd take a free crippled media player, but probably in
the end would resent its closed nature, its lack of flexibility or
expandability, and all the things that led me to personal computers
and software in the first place.

 As to an earlier comment about video editing in order to remove ads,  
 this is apparently the norm in the world of people who are heavy  
 uploaders/crossloaders of video content via P2P systems.  It seems  
 there are different 'crews' who compete to produce a 'quality  
 product' in terms of the quality of the encoding, compression,  
 bundling/remixing, etc.; it's very reminiscent of the 'warez' scene  
 in that regard.

This is an interesting free video series on the illicit movie copying
scene:

http://www.welcometothescene.com/

It is somewhat unusual in that most of the videos are split-screen shots,
most of the conversation is typed, and an understanding of various
technical topics is necessary to be able to follow the show at all.

 It's an interesting  
 question as to whether or not the energy and 'professional pride' of  
 this group of people could somehow be harnessed in order to provide
 and distribute content legally (as almost all of what people really  
 want seems to be infringing content under the current standard  
 model), and monetized so that they receive compensation and  
 essentially act as the packaging and distribution arm for content  
 providers willing to try such a model.

IMHO I fail to see how they would be (or remain) any different from
the current distribution channels.  It's akin to asking if the
open-source community could somehow be harnessed and paid for creating
software.  Yes; it's already being done, and there are qualitative
differences in the results.  When there is no financial interest,
artisanship and craftsmanship predominate as motivators.  When driven
by financial interests, often those languish, and the market forces of
suckification move the product inexorably from one which is the most
desirable to use, to one with as many built-in annoyances and
advertisements as the end-user will tolerate, all the useless features
necessary to confuse the purchaser into rational ignorance, and all
plausible mechanisms to lock the user in over time (or otherwise raise
their switching costs).  But I'm not cynical... ;-)

This is way off charter, but I recently read of a study where art students
were asked to create some artwork.  One group was given a financial
reward.  The results were anonymized, and evaluators judged the results.
Once unblinded, the study found that the group with the financial reward
was statistically significantly judged as less creative and as producing
lower-quality work.

 As a side note, it seems there's a growing phenomenon of 'upload
 cheating' taking place in the BitTorrent space, with clients such as
 BitTyrant and BitThief becoming more and more popular while at the
 same time disrupting the distribution economies of P2P networks.
 This has caused a great deal of consternation in the infringing-
 oriented P2P community of interest, with the developers/operators of
 various BitTorrent-type systems such as BitComet working at
 developing methods of detecting and blocking downloading from users
 who 'cheat' in this fashion; it is instructive (and more than a
 little ironic) to watch as various elements within the infringing-
 oriented P2P community attempt to 

Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-17 Thread Travis H.
On Sat, Jan 06, 2007 at 02:35:25PM +, Colm MacCarthaigh wrote:
 Oh I should be clear too. We use SI powers of 10, just like for
 bandwidth, not powers of two like for storage. We quote in Megabytes
 because caps are usually in gigabytes, so it's more clear for users.

IEC 60027-2 prefixes eliminate the ambiguity.

http://en.wikipedia.org/wiki/Binary_prefix

Basically, to make a prefix base-2, replace its third and fourth letters
with "bi" for binary: mega (MB) becomes mebi (MiB), and giga (GB) becomes
gibi (GiB), where 1 GiB = 2^30 bytes versus 1 GB = 10^9 bytes.
-- 
``Unthinking respect for authority is the greatest enemy of truth.''
-- Albert Einstein -- URL:http://www.subspacefield.org/~travis/




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-16 Thread Peter Corlett

On Tue, Jan 16, 2007 at 11:53:25AM +1300, Richard Naylor wrote:
[...]
 I don't see many obstacles for content and neither do other broadcasters.
 The broadcast world is changing. Late last year ABC or NBC (sorry brain
 fade) announced the lay off of 700 News staff, saying news is no longer
 king.

Was it ever? Allegedly Murdoch's Sky only launched their Sky News channel so
they could claim to be a reputable broadcaster.



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-15 Thread Andy Davidson



On 12 Jan 2007, at 15:26, Gian Constantine wrote:

I am pretty sure we are not becoming a VoD world. Linear  
programming is much better for advertisers. I do not think content  
providers, nor consumers, would prefer a VoD only service. A  
handful of consumers would love it, but many would not.


There are already cheap and efficient ways of doing VoD-like services  
with a PVR - I timeshift almost everything that I want to watch  
because it's on at inconvenient times.  So shows get spooled to disk  
whilst they're broadcasted efficiently, and I can watch them later.


Any sort of Broadcast-Video-over-IP system that employed that  
technology would be a winner.  You don't need to 'broadcast' the show  
in real time either if it's going to be spooled to disk, even as it  
is viewed.


-a



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-15 Thread Michal Krsek



I am pretty sure we are not becoming a VoD world. Linear programming 
is much better for advertisers. I do not think content providers, nor 
consumers, would prefer a VoD only service. A handful of consumers 
would love it, but many would not.


There are already cheap and efficient ways of doing VoD-like services 
with a PVR - I timeshift almost everything that I want to watch 
because it's on at inconvenient times.  So shows get spooled to disk 
whilst they're broadcasted efficiently, and I can watch them later.


Any sort of Broadcast-Video-over-IP system that employed that 
technology would be a winner.  You don't need to 'broadcast' the show 
in real time either if it's going to be spooled to disk, even as it is 
viewed.
This system works perfectly in our linear-line distribution (channels): as 
a user you can choose the time you want to see the show, but not the show 
itself. Capacity on a PVR device is finite, and if you don't want to waste 
the space on just any broadcasted content you have to program the device. I 
have ten channels on my cable TV and sometimes I'm confused about what to 
record. Being in the US and paying for ~100 channels would make me mad 
crawling channel schedules :-)


So the technology is nice, but not a "what you want is what you get." So 
you cannot address the long tail using this technology.


  Regards
 Michal



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-15 Thread Joe Abley



On 15-Jan-2007, at 08:48, Michal Krsek wrote:

This system works perfectly in our linear-line distribution  
(channels). As user you can choose time you want to see the show,  
but not the show itself. Capacity on PVR device is finite and if  
you don't want to waste the space with any broadcasted content you  
have to program the device. I have ten channels in my cable TV and  
sometimes I'm confused what to record. Being in the US and paying  
for ~100 channels would make me mad crawling channel schedules :-)


So the technology is nice, but not a "what you want is what you  
get." So you cannot address the long tail using this technology.


These are all UI details.

The (Scientific Atlanta, I think) PVRs that Rogers Cable gives  
subscribers here in Ontario let you specify the *names* of shows that  
you like, rather than selecting specific channels and times; I seem  
to think you can also tell it to automatically ditch old recorded  
material when disk space becomes low.


One thing that may not be obvious to people who haven't had this  
misfortune of consuming it at first hand is that North American TV,  
awash with channels as it is, contains a lot of duplicated content.  
The same episode of the same show might be broadcast tens of times  
per week; the same advertisement might be broadcast tens of times per  
hour.


How much more programming would the existing networks support if they  
were able to reduce those retransmissions, relying on the ubiquity of  
set-top boxes with PVR functionality?



Joe



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-15 Thread Gian Constantine
The problem with this all (or mostly) VoD model is the entrenched  
culture. In countries outside of the U.S. with smaller channel  
lineups, an all VoD model might be easier to migrate to over time. In  
the U.S., where we have 200+ channel lineups, consumers have become  
accustomed to the massive variety and instant gratification of a  
linear lineup. If you leave it to the customer to choose their  
programs, and then wait for them to arrive and be viewed, the instant  
gratification aspect is lost. This is important to consumers here.


While I do not think an all or mostly VoD model will work for  
consumers in U.S. in the near term (next 5 years), it may work in the  
long term (7-10 years). There are so many obstacles in the way from a  
business side of things, though.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.

On Jan 15, 2007, at 9:31 AM, Joe Abley wrote:




On 15-Jan-2007, at 08:48, Michal Krsek wrote:

This system works perfectly in our linear-line distribution  
(channels). As user you can choose time you want to see the show,  
but not the show itself. Capacity on PVR device is finite and if  
you don't want to waste the space with any broadcasted content you  
have to program the device. I have ten channels in my cable TV and  
sometimes I'm confused what to record. Being in the US and paying  
for ~100 channels would make me mad crawling channel schedules :-)


So the technology is nice, but not a "what you want is what you  
get." So you cannot address the long tail using this technology.


These are all UI details.

The (Scientific Atlanta, I think) PVRs that Rogers Cable gives  
subscribers here in Ontario let you specify the *names* of shows  
that you like, rather than selecting specific channels and times; I  
seem to think you can also tell it to automatically ditch old  
recorded material when disk space becomes low.


One thing that may not be obvious to people who haven't had this  
misfortune of consuming it at first hand is that North American TV,  
awash with channels as it is, contains a lot of duplicated content.  
The same episode of the same show might be broadcast tens of times  
per week; the same advertisement might be broadcast tens of times  
per hour.


How much more programming would the existing networks support if  
they were able to reduce those retransmissions, relying on the  
ubiquity of set-top boxes with PVR functionality?



Joe





RE: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-15 Thread Bora Akyol

Steve

That's mostly because the DVR boxes given out by the cable companies (mine
is a Moto from Comcast) are terrible. The UI is just plain unusable,
especially for the on-demand portion of the DVR guide.

I have caught up with the thread this morning and I have to say, I don't
understand why people think of video distribution via the Internet as
channels. The only reason why channels exist is due to the medium when
TV was started. I expect the next generation of video to be a lot like
GooTube or iTunes.

Most of it is pushed while you are sleeping, plus a few (<200) mcast
streams for live content like news, etc.

The question I asked earlier was, whether the last-mile SP networks can
handle 24x7 100% link utilization for all of their customers. I don't
think they can. And frankly, I don't know how they are going to get
revenue from the content distributors to upgrade the networks. Does
Apple reimburse Comcast (my SP) when I download a song? I don't think
so. What about a movie? Again, I don't think so.

You see where I am going with this.

Bora




 -Original Message-
 From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On 
 Behalf Of Steve Sobol
 Sent: Friday, January 12, 2007 9:37 PM
 To: Mikael Abrahamsson
 Cc: nanog@merit.edu
 Subject: Re: Network end users to pull down 2 gigabytes a 
 day, continuously?
 
 
 On Sat, 13 Jan 2007, Mikael Abrahamsson wrote:
 
  My experience is that when you show people VoD, they like it. 
 
 I have to admit the wow factor is there. But I already have 
 access to VoD 
 through my cable company and its set-top boxes. TV over IP brings my 
 family exactly zero additional benefits.
 



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-15 Thread Dave Israel

Bora Akyol wrote:

 The question I asked earlier was, whether the last-mile SP networks
 can handle 24x7 100% link utilization for all of their customers. I
 don't think they can. And frankly, I don't know how they are going
 to get revenue from the content distributors to upgrade the
 networks. Does Apple reimburse Comcast (my SP) when I download a
 song? I don't think so? What about a movie? Again, I don't think
 so.

 You see where I am going with this.

The past solution to repetitive requests for the same content has been
caching, either reactive (webcaching) or proactive (Akamaizing.)  I
think it is the latter we will see; service providers will push
reasonably cheap servers close to the edge where they aren't too
oversubscribed, and stuff their content there.  A cluster of servers
with terabytes of disk at a regional POP will cost a lot less than
upgrading the upstream links.  And even if the SPs do not want to
invest in developing this product platform for themselves, the price
will likely be paid by the content providers who need performance to
keep subscribers.

I think the biggest stumbling block isn't technical.  It is a question
of getting enough content to attract viewers, or alternately, getting
enough viewers to attract content.  Plus, you're going to a format
where the ability to fast-forward commercials is a fact, not a risk,
and you'll have to find a way to get advertisers' products in front of
the viewer to move past pay-per-view.  It's all economics and politics
now.

-Dave




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-15 Thread Richard Naylor


At 09:50 a.m. 15/01/2007 -0500, Gian Constantine wrote:
The problem with this all (or mostly) VoD model is the entrenched culture. 
In countries outside of the U.S. with smaller channel lineups, an all VoD 
model might be easier to migrate to over time. In the U.S., where we have 
200+ channel lineups, consumers have become accustomed to the massive 
variety and instant gratification of a linear lineup. If you leave it to 
the customer to choose their programs, and then wait for them to arrive 
and be viewed, the instant gratification aspect is lost. This is important 
to consumers here.


While I do not think an all or mostly VoD model will work for consumers in 
U.S. in the near term (next 5 years), it may work in the long term (7-10 
years). There are so many obstacles in the way from a business side of 
things, though.


I don't see many obstacles for content and neither do other broadcasters. 
The broadcast world is changing. Late last year ABC or NBC (sorry, brain 
fade) announced the lay-off of 700 news staff, saying news is no longer 
king. Instead they are moving to a strategy similar to that of the BBC, i.e. 
lots of on-demand content on the Internet.


Rich




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-14 Thread Mikael Abrahamsson


On Sat, 13 Jan 2007, Roland Dobbins wrote:

again a la the warez community.  It's an interesting question as to whether 
or not the energy and 'professional pride' of this group of people could 
somehow be harnessed in order to provide and distribute content legally (as 
almost all of what people really want seems to be infringing content under 
the current standard model), and monetized so that they receive compensation 
and essentially act as the packaging and distribution arm for content 
providers willing to try such a model.  A related question is just how


You make a lot of very valid points in your email, but I just had to 
respond to the above. The only reason they have for ripping, ad-removing 
and distributing TV series over the internet is that there is no legal 
way to obtain these in the quality they provide. So you're right: they 
provide a service people want at a price they want (remember that people 
spend quite a lot of money on hard drives, broadband connections etc to 
give them the experience they require).


If this same experience could be enjoyed via a VoD box from a service 
provider at a low enough price that people would want to pay for it (along 
the lines of the prices I mentioned earlier), I am sure that a lot of 
regular people would switch away from getting their content via P2P and 
get it directly from the source. Why go through ripping, ad-removing, 
xvid-encoding, the warez scene and then P2P sites, after which you still 
have to unpack the content to watch it, perhaps on your computer, when the 
content creator is sitting there on a perhaps 50-100 megabit/s MPEG stream 
from which a high-VBR MPEG4 stream could be created directly via some 
replication system and delivered as VoD to your home over your broadband 
internet connection?


There is only one reason for those people doing what they do: the content 
owners want to control the distribution channel, and they're not realising 
they never will be able to do that. DRM has always failed; systems like 
Macrovision, region coding (DVD), encryption (DVD) and now, I read, the 
HD DVD system are all broken, and future systems will be broken too.


So the key is convenience and quality at a low price, aka 
price/performance on the experience. Make it cheap and convenient enough 
that the current hassle is just not worth it.


--
Mikael Abrahamsson    email: [EMAIL PROTECTED]


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-13 Thread Sean Donelan


On Fri, 12 Jan 2007, Stephen Sprunk wrote:
There is no technical challenge here; what the pirates are already doing 
works pretty well, and with a little UI work it'd even be ready for the mass 
market.  The challenges are figuring out how to pay for the pipes needed to 
deliver all these bits at consumer rates, and how to collect revenue from all 
the viewers to fairly compensate the producers -- both business problems, 
though for different folks.


Will the North American market change from using speed to volume for 
pricing Internet connections?  Web hosting and other markets around the 
world already use GB-transferred packages instead of port speed.

What happens if a 100Mbps port is $19.95/month with $1.95 per GB 
transferred up and down?  Are P2P swarms as attractive?


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-13 Thread Marshall Eubanks


Of course, this below is for inter-domain. There is no shortage of 
multicast walled-garden deployments.

Regards
Marshall

On Jan 12, 2007, at 7:44 PM, Marshall Eubanks wrote:




On Jan 12, 2007, at 10:05 AM, Frank Bulk wrote:



If we're becoming a VOD world, does multicast play any practical role in 
video distribution?


Not to end users.

I think multicast is used a fair amount for precaching; presumably  
that would increase in this scenario.


Regards
Marshall

P.S. Of course, I do not agree we are moving to a pure VOD world. I  
agree with Michal Krsek in this regard.




Frank

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Michal Krsek
Sent: Wednesday, January 10, 2007 2:28 AM
To: Marshall Eubanks
Cc: nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a day, continuously?



Hi Marshall,


- the largest channel has 1.8% of the audience
- 50% of the audience is in the largest 2700 channels
- the least watched channel has ~ 10 simultaneous viewers
- the multicast bandwidth usage would be 3% of the unicast.


I'm a bit skeptical about the future of channels. For making money from 
the long tail, you have to adapt your distribution to users' needs. It is 
not only format, codec ... but also time frame. You can organise your 
programs in channels, but they will not run simultaneously for all the 
users. I want to control my TV; I don't want my TV to jockey my life.

For the distribution, you as content owner have to help the ISP find the 
right way to distribute your content. For example: having a distribution 
center in a Tier 1 ISP's network will make money from Tier 2 ISPs 
connected directly to the Tier 1. Probably, having a CDN (your own, or 
paying for the service) will be the only way to do large-scale 
non-synchronous programming.

Regards
Michal








Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-13 Thread Mike Leber



On Sat, 13 Jan 2007, Sean Donelan wrote:
 On Fri, 12 Jan 2007, Stephen Sprunk wrote:
  There is no technical challenge here; what the pirates are already doing
  works pretty well, and with a little UI work it'd even be ready for the
  mass market.  The challenges are figuring out how to pay for the pipes
  needed to deliver all these bits at consumer rates, and how to collect
  revenue from all the viewers to fairly compensate the producers -- both
  business problems, though for different folks.
 
 Will the North American market change from using speed to volume for 
 pricing Internet connections?  Web hosting and other markets around the
 world already use GB/transferred packages instead of the port speed.

The North American market started with charging per GB transferred and went
away from it because the drop in cost per Mbps for both circuits and
transit made costs low enough that providers could statistically
multiplex their user base and offer unlimited service (unlimited, for
marketing departments, means being able to offer something to 99 percent of
your customer base, which explains all the residential service clauses
stating that unlimited doesn't really mean unlimited).

You can see this repeatedly for all sorts of products as costs have come
down in the long view.  For example, consumer Internet dialup, long
distance calling plans, local phone service plans, some aspects of cell
phone service, it might be happening with online storage right now (i.e.
google gmail/gfs and the browser plugins that let you store files in your
gmail account).

What might or might not be trending is a digression; unlimited service is
a marketing condition that seems to occur when, for 99 percent of your
customer base, the cost of their usage is less than the benefit of
offering unlimited service.

Mike.

+- H U R R I C A N E - E L E C T R I C -+
| Mike Leber   Direct Internet Connections   Voice 510 580 4100 |
| Hurricane Electric Web Hosting  Colocation   Fax 510 580 4151 |
| [EMAIL PROTECTED]   http://www.he.net |
+---+




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-13 Thread Marshall Eubanks



On Jan 12, 2007, at 11:27 PM, Mikael Abrahamsson wrote:



On Fri, 12 Jan 2007, Gian Constantine wrote:

I am pretty sure we are not becoming a VoD world. Linear  
programming is much better for advertisers. I do not think content  
providers, nor consumers, would prefer a VoD only service. A  
handful of consumers would love it, but many would not.


My experience is that when you show people VoD, they like it. A lot 
of people won't abandon linear programming because it's easy to 
just watch whatever is on, but if you give them the possibility 
of watching VoD (DVD sales of TV series, for instance) some will 
definitely start doing both. Same thing with HDTV: until you show 
it to people they couldn't care less, but when you've shown them 
they do start to get interested.


I have been trying to find out the advertising ARPU for the cable 
companies for a prime-time TV show in the US, i.e. how much would I 
need to pay them to get the same content but without the 
advertising, and then add the cost of VoD delivery. This is purely 
theoretical, but it would give a rough indication of what a VoD 
distribution model might cost the end user if we were to add that 
distribution channel. Does anyone know any rough figures for 
advertising ARPU per hour on primetime? I'd love to hear it.


Generally, in the US, the content is sent to the cable company with 
ads already inserted, although they might get their own ad slots. You 
would need to talk to the source, i.e., the network. Since you would 
be threatening the business model of their major customers, you would 
need patience and a lot of financial backing.


For the US, an analysis by Kenneth Wilbur 
(http://papers.ssrn.com/sol3/papers.cfm?abstract_id=885465, table 1), 
from this recent meeting in DC:

http://www.web.virginia.edu/media/agenda.html

shows that the cost per thousand per ad (the CPM), averaged over 5 
networks and all nights of the week, was $24 +- 9; these are 
1/2-minute ads. The mean ad level per half-hour is 5.15 minutes, 
so that's 10.3 x $24 or $247 / hour / 1000. This is for the 
evening; rates and audiences at other times are less. So, for a 1/2-hour 
evening show, on average the VOD would need to cost at least 
$0.12 US to recoup the ad revenues. Popular shows get a higher CPM, 
so they would cost more. The Wilbur paper and some of the other 
papers at this conference present a lot of breakdown of these sorts 
of statistics, if you are interested.


Regards
Marshall



--
Mikael Abrahamsson    email: [EMAIL PROTECTED]




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-13 Thread Mikael Abrahamsson


On Sat, 13 Jan 2007, Sean Donelan wrote:

What happens if a 100Mbps port is $19.95/month with $1.95 per GB 
transferred up and down?  Are P2P swarms as attractive?


$1.95 is outrageously expensive. Let's say we want to pass on our costs to 
the users with the highest usage:

1 megabit/s for a month is:

1/8 * 60 * 60 * 24 * 30 = 324,000 MB = 324 gigabytes

Let's say this 1 megabit/s costs us $20 (which is fairly high in most 
markets). That means the price of a gigabyte transferred should be $0.06; 
let's increase that (because of peak usage, administrative costs etc) 
to $0.2.


Now, let's include 35 gigs of traffic in each user's allotment to get rid 
of usage-based billing for most users (100 kilobit/s average usage) and 
add that to your 100 meg port above, and we end up with around $28; let's 
make that $29.95 a month including the 35 gigs. Hey, make it 50 gigs for 
good measure.

Now, my guess is that 90% of the users will never use more than 50 gigs, 
and if they do, their increased usage will be quite marginal; but if 
someone actually uses 5 megabit/s on average (1.6 terabytes per month, not 
unheard of), that person will have to fork out some money ($300 extra per 
month).
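
For concreteness, here's a quick Python sketch of the arithmetic above (a 
back-of-the-envelope model only; the constants are the example figures 
from this message, and the names are mine):

SECONDS_PER_MONTH = 60 * 60 * 24 * 30

def gb_per_month(avg_mbit_per_s):
    # GB transferred in a month at a constant average rate (MB -> GB)
    return avg_mbit_per_s / 8.0 * SECONDS_PER_MONTH / 1000.0

COST_PER_MBIT = 20.0                               # $/Mbit/s/month, fairly high
raw_cost_per_gb = COST_PER_MBIT / gb_per_month(1)  # ~$0.062
PRICE_PER_GB = 0.20                                # padded for peaks, admin etc
INCLUDED_GB = 50                                   # allotment in the flat fee
FLAT_FEE = 29.95

def monthly_bill(avg_mbit_per_s):
    overage = max(0.0, gb_per_month(avg_mbit_per_s) - INCLUDED_GB)
    return FLAT_FEE + overage * PRICE_PER_GB

print(round(raw_cost_per_gb, 3))   # 0.062
print(monthly_bill(0.1))           # 29.95  -- a typical user stays inside
print(round(monthly_bill(5), 2))   # 343.95 -- ~$314 on top of the flat fee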


Oh, this model would also require that you pay for the bandwidth you 
PRODUCE, not what you receive (since you cannot control what you receive: 
DDoS, scanning etc). So basically anyone sourcing material to the internet 
would have to pay in some way; the ones receiving wouldn't have to pay so 
much (only their monthly fee).


The bad part is that this model would most likely hinder a lot of 
content producers from actually publishing their content, but on the other 
hand it might make it a better deal to distribute content closer to the 
customers, as carriers might be inclined to let you put servers in their 
network that can only send traffic to their network, not anybody else. It 
might also preclude a model where carriers charge each other on the amount 
of incoming traffic they see from peers.


Personally, I don't think I want to see this, but it does make sense in an 
economical/technical way, somewhat like road tolls.


--
Mikael Abrahamsson    email: [EMAIL PROTECTED]


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-13 Thread Mikael Abrahamsson


On Sat, 13 Jan 2007, Marshall Eubanks wrote:


For the US, an analysis by Kenneth Wilbur 
(http://papers.ssrn.com/sol3/papers.cfm?abstract_id=885465, table 1), from 
this recent meeting in DC:

http://www.web.virginia.edu/media/agenda.html


Couldn't read the PDFs, so I'll just go from your figures below:

shows that the cost per thousand per ad (the CPM), averaged over 5 networks 
and all nights of the week, was $24 +- 9; these are 1/2-minute ads. The 
mean ad level per half-hour is 5.15 minutes, so that's 10.3 x $24 or 
$247 / hour / 1000. This is for the evening; rates and audiences at other 
times are less. So, for a 1/2-hour evening show, on average the VOD would 
need to cost at least $0.12 US to recoup the ad revenues. Popular shows 
get a higher CPM, so they would cost more. The Wilbur paper and some of 
the other papers at this conference present a lot of breakdown of these 
sorts of statistics, if you are interested.


Thanks for the figures. So basically, if we can encode a 23-minute show (30 
minutes minus ads) into a gig of traffic (precomputed HD 1080i with high 
VBR), the network cost would be around $0.2 (figure from my previous email, 
on margin) and we'd pay $0.2 to the content owner, and they would make the 
same amount of money as they do now? So basically the marginal cost of this 
service would be around $0.4-0.5 per show, and double that for a 45-minute 
episode (the current 1-hour show format)?
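
In Python, that per-show arithmetic (same caveat as above: these are this 
thread's example figures, nothing more):

PRICE_PER_GB = 0.20      # network cost per GB from my previous email
SHOW_SIZE_GB = 1.0       # ~23 minutes of precomputed HD 1080i, high VBR
CONTENT_FEE = 0.20       # rough per-viewer payment to the content owner

half_hour_show = SHOW_SIZE_GB * PRICE_PER_GB + CONTENT_FEE  # ~$0.40
hour_show = 2 * half_hour_show                              # ~$0.80
print(half_hour_show, hour_show)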


So the question becomes whether people might be inclined to pay $1 to watch 
an ad-free TV show. If they're paying $1.99 to iTunes for the actual 
download right now, they might be willing to pay $0.99 to watch it over VoD?


As you said, of course it would take an enormous amount of time and effort 
to convince the content owners of this model. Whether ISPs would be 
interested at these levels is also a good question.


--
Mikael Abrahamsson    email: [EMAIL PROTECTED]


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-13 Thread Marshall Eubanks



On Jan 13, 2007, at 6:12 AM, Marshall Eubanks wrote:




On Jan 12, 2007, at 11:27 PM, Mikael Abrahamsson wrote:



On Fri, 12 Jan 2007, Gian Constantine wrote:

I am pretty sure we are not becoming a VoD world. Linear  
programming is much better for advertisers. I do not think  
content providers, nor consumers, would prefer a VoD only  
service. A handful of consumers would love it, but many would not.


My experience is that when you show people VoD, they like it. A lot 
of people won't abandon linear programming because it's easy to 
just watch whatever is on, but if you give them the possibility 
of watching VoD (DVD sales of TV series, for instance) some will 
definitely start doing both. Same thing with HDTV: until you show 
it to people they couldn't care less, but when you've shown them 
they do start to get interested.


I have been trying to find out the advertising ARPU for the cable 
companies for a prime-time TV show in the US, i.e. how much would I 
need to pay them to get the same content but without the 
advertising, and then add the cost of VoD delivery. This is purely 
theoretical, but it would give a rough indication of what a VoD 
distribution model might cost the end user if we were to add that 
distribution channel. Does anyone know any rough figures for 
advertising ARPU per hour on primetime? I'd love to hear it.


Generally, in the US, the content is sent to the cable company with 
ads already inserted, although they might get their own ad slots. 
You would need to talk to the source, i.e., the network. Since you 
would be threatening the business model of their major customers, 
you would need patience and a lot of financial backing.


For the US, an analysis by Kenneth Wilbur 
(http://papers.ssrn.com/sol3/papers.cfm?abstract_id=885465, table 1), 
from this recent meeting in DC:

http://www.web.virginia.edu/media/agenda.html

shows that the cost per thousand per ad (the CPM), averaged over 5 
networks and all nights of the week, was $24 +- 9; these are 
1/2-minute ads. The mean ad level per half-hour is 5.15 minutes, 
so that's 10.3 x $24 or $247 / hour / 1000. This


Sorry, that should be

per half-hour

(i.e., there are 10.3 half-minute ads per half-hour on average.)

is for the evening; rates and audiences at other times are less. So, 
for a 1/2-hour evening show, on average the VOD would need to cost 
at least $0.12 US to recoup the ad revenues. Popular shows get a 
higher CPM, so they would cost


So that should be $0.25 per half-hour per person.
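
(For anyone checking along, a tiny Python sketch of the corrected figures; 
the variable names are mine:)

CPM = 24.0                # $ per thousand viewers per half-minute ad, +- 9
AD_MINUTES = 5.15         # mean ad minutes per half-hour

ads_per_half_hour = AD_MINUTES / 0.5              # 10.3 half-minute ads
per_1000_per_half_hour = ads_per_half_hour * CPM  # ~$247
per_viewer = per_1000_per_half_hour / 1000        # ~$0.25
print(ads_per_half_hour, per_1000_per_half_hour, round(per_viewer, 2))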

I think that the advertising world needs a more metric system of  
measuring things and that I need some coffee.


more. The Wilbur paper and some of the other papers at this  
conference present a lot of breakdown of these sorts of statistics,  
if you are interested.


Regards
Marshall



Regards



--
Mikael Abrahamsson    email: [EMAIL PROTECTED]






Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-13 Thread Marshall Eubanks


Dear Mikael,

On Jan 13, 2007, at 6:45 AM, Mikael Abrahamsson wrote:



On Sat, 13 Jan 2007, Marshall Eubanks wrote:


For the US, an analysis by Kenneth Wilbur 
(http://papers.ssrn.com/sol3/papers.cfm?abstract_id=885465, table 1), 
from this recent meeting in DC:

http://www.web.virginia.edu/media/agenda.html


Couldn't read the PDFs, so I'll just go from your figures below:

shows that the cost per thousand per ad (the CPM), averaged over 5 
networks and all nights of the week, was $24 +- 9; these are 
1/2-minute ads. The mean ad level per half-hour is 5.15 minutes, so 
that's 10.3 x $24 or $247 / hour / 1000. This is for the evening; 
rates and audiences at other times are less. So, for a 1/2-hour 
evening show, on average the VOD would need to cost at least $0.12 US 
to recoup the ad revenues. Popular shows get a higher CPM, so they 
would cost more. The Wilbur paper and some of the other papers at 
this conference present a lot of breakdown of these sorts of 
statistics, if you are interested.


Thanks for the figures. So basically, if we can encode a 23-minute 
show (30 minutes minus ads) into a gig of traffic (precomputed HD 
1080i with high VBR), the network cost would be around $0.2 (figure 
from my previous email, on margin) and we'd pay $0.2 to the content 
owner, and they would make the same amount of money as they do now? 
So basically the marginal cost of this service would be around 
$0.4-0.5 per show, and double that for a 45-minute episode (the 
current 1-hour show format)?




Yes - you saw I made a factor of two error in this (per hour vs per  
half hour), but, yes, that's the size you are talking about.


A technical issue that I have to deal with is that you get a 30-minute 
show (actually 24 minutes of content) as 30 minutes, _with the ad slots 
included_. To show it without ads, you actually have to take the show 
into a video editor and remove the ad slots, which costs video editor 
time, which is expensive.


So the question becomes whether people might be inclined to pay $1 to 
watch an ad-free TV show. If they're paying $1.99 to iTunes for the 
actual download right now, they might be willing to pay $0.99 to 
watch it over VoD?


As you said, of course it would take an enormous amount of time and 
effort to convince the content owners of this model. Whether ISPs 
would be interested at these levels is also a good question.




A business model I have wondered about is: take the network feed, pay 
the subscriber cost, and sell it over the Internet as an encrypted 
channel _with ads_.

Would you be willing to pay $5 or even $10 per month to watch just 
one channel, as shown over the air?

I would, and here's why.

In the USA at least, the cable companies make you pay for bundles 
to get channels you want. I have to pay for 3 bundles to get 2 
channels we actually want to watch. (One of these bundles is 
apparently only sold if you are already getting another, which we 
don't actually care about.) So, it actually costs us $40+/month to 
get the two channels we want (plus a bunch we don't). So, it occurs 
to me that there is a business in selling solo channels on the 
Internet, as is, with the ads, for on the order of $5-$10 per 
subscriber per month, which should leave a substantial profit after 
the payments to the networks and bandwidth costs.



--
Mikael Abrahamsson    email: [EMAIL PROTECTED]


Regards
Marshall


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-13 Thread Mikael Abrahamsson


On Sat, 13 Jan 2007, Marshall Eubanks wrote:

A technical issue that I have to deal with is that you get a 30-minute 
show (actually 24 minutes of content) as 30 minutes, _with the ad slots 
included_. To show it without ads, you actually have to take the show 
into a video editor and remove the ad slots, which costs video editor 
time, which is expensive.


Well, in this case you'd hopefully get the show directly from whoever is 
producing it without ads in the first place, basically the same content 
you might see if you buy the show on DVD.


In the USA at least, the cable companies make you pay for bundles to 
get channels you want. I have to pay for 3 bundles to get 2 channels we 
actually want to watch. (One of these bundles is apparently only sold if 
you are already getting another, which we don't actually care about.) 
So, it actually costs us $40+/month to get the two channels we want 
(plus a bunch we don't). So, it occurs to me that there is a business in 
selling solo channels on the Internet, as is, with the ads, for on the 
order of $5-$10 per subscriber per month, which should leave a 
substantial profit after the payments to the networks and bandwidth costs.


There is zero problem for the cable companies to immediately compete with 
you by offering the same thing, as soon as there is competition. Since 
their channel is the most established, my guess is that you would have a 
hard time succeeding where they already have a footprint and established 
customers.


Where you could do well with your proposal is where there is no cable TV 
available at all.


--
Mikael Abrahamsson    email: [EMAIL PROTECTED]


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-13 Thread Marshall Eubanks



On Jan 13, 2007, at 7:36 AM, Mikael Abrahamsson wrote:



On Sat, 13 Jan 2007, Marshall Eubanks wrote:

A technical issue that I have to deal with is that you get a 30-minute 
show (actually 24 minutes of content) as 30 minutes, _with the ad slots 
included_. To show it without ads, you actually have to take the show 
into a video editor and remove the ad slots, which costs video editor 
time, which is expensive.


Well, in this case you'd hopefully get the show directly from  
whoever is producing it without ads in the first place, basically  
the same content you might see if you buy the show on DVD.




I do get it from the producer; that is what they produce. (And the  
video editor time referred to is people time, not machine time, which  
is trivial.)


In the USA at least, the cable companies make you pay for bundles 
to get channels you want. I have to pay for 3 bundles to get 2 
channels we actually want to watch. (One of these bundles is 
apparently only sold if you are already getting another, which we 
don't actually care about.) So, it actually costs us $40+/month to 
get the two channels we want (plus a bunch we don't). So, it occurs 
to me that there is a business in selling solo channels on the 
Internet, as is, with the ads, for on the order of $5-$10 per 
subscriber per month, which should leave a substantial profit 
after the payments to the networks and bandwidth costs.


There is zero problem for the cable companies to immediately  
compete with you by offering the same thing, as soon as there is  
competition. Since their channel is the most established, my guess  
is that you would have a hard time succeeding where they already  
have a footprint and established customers.


Yes, and that has the potential of immediately reducing their income  
by a factor of 2 or more.


I suspect that they would compete at first by putting pressure on the 
channel aggregators not to sell to such businesses. (Note: this is 
NOT a business I am pursuing at present.)


What I do conclude from this is that the oncoming wave of IPTV and  
Internet Television is going to be very disruptive.


Where you could do well with your proposal is where there is no 
cable TV available at all.


Regards



--
Mikael Abrahamsson    email: [EMAIL PROTECTED]




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-13 Thread Gian Constantine
The cable companies have been chomping at the bit for unbundled 
channels for years, and so have consumers. The content providers will 
never let it happen. Their claim is that the popular channels support 
the diversity of not-so-popular channels. Apparently, production costs 
are high all around (not surprising) and most channels do not support 
themselves entirely.


The MSOs have had a la carte on their Santa wish list for years and  
the content providers do not believe in Santa Claus. :-) They believe  
in Benjamin Franklin...lots and lots of Benjamin Franklin.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.

On Jan 13, 2007, at 7:14 AM, Marshall Eubanks wrote:

In the USA at least, the cable companies make you pay for bundles  
to get channels you want. I have to pay for
3 bundles to get 2 channels we actually want to watch. (One of  
these bundle is apparently only sold if you are already getting  
another, which we don't actually care about.) So, it actually costs  
us $ 40 + / month to get the two channels we want (plus a bunch we  
don't.) So, it occurs to me that there is a business selling solo  
channels on the Internet, as is, with the ads, for order $ 5 - $ 10  
per subscriber per month, which should leave a substantial profit  
after the payments to the networks and bandwidth costs.




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-13 Thread Andrew Odlyzko

Extensive evidence of the phenomenon Mike describes (inexpensive,
frequently used things moving towards flat rate, expensive and
rare ones towards sophisticated schemes a la Saturday-night
stopover fares) is presented in my paper "Internet pricing and
the history of communications", Computer Networks 36 (2001),
pp. 493-517, available at

  http://www.dtc.umn.edu/~odlyzko/doc/history.communications1b.pdf

It also explains some of the mechanisms behind this tendency, drawn
both from conventional economics (bundling, etc.) and behavioral
economics (willingness to pay more for flat rates).

This tendency can indeed reverse in cases of extreme asymmetry of
usage.  But one has to be careful there.  Heavy users are often
the most valuable.  (In today's environment they are often the
ones who provide the P2P material that attracts other users to the
network.  And yes, there is a problem there, in that you don't
need such heavy users to be on YOUR network for them to be an
attraction in signing up new subscribers.)

Andrew




   On Sat Jan 13, Mike Leber wrote:

  On Sat, 13 Jan 2007, Sean Donelan wrote:
   On Fri, 12 Jan 2007, Stephen Sprunk wrote:
There is no technical challenge here; what the pirates are already doing
works pretty well, and with a little UI work it'd even be ready for the
mass market.  The challenges are figuring out how to pay for the pipes
needed to deliver all these bits at consumer rates, and how to collect
revenue from all the viewers to fairly compensate the producers -- both
business problems, though for different folks.
   
   Will the North American market change from using speed to volume for 
   pricing Internet connections?  Web hosting and other markets around the
   world already use GB/transferred packages instead of the port speed.

  The North American market started with charging per GB transferred and went
  away from it because the drop in cost per Mbps for both circuits and
  transit made costs low enough that providers could statistically
  multiplex their user base and offer unlimited service (unlimited, for
  marketing departments, means being able to offer something to 99 percent of
  your customer base, which explains all the residential service clauses
  stating that unlimited doesn't really mean unlimited).

  You can see this repeatedly for all sorts of products as costs have come
  down in the long view.  For example, consumer Internet dialup, long
  distance calling plans, local phone service plans, some aspects of cell
  phone service, it might be happening with online storage right now (i.e.
  google gmail/gfs and the browser plugins that let you store files in your
  gmail account).

  What might or might not be trending is a digression; unlimited service is
  a marketing condition that seems to occur when, for 99 percent of your
  customer base, the cost of their usage is less than the benefit of
  offering unlimited service.

  Mike.

  +- H U R R I C A N E - E L E C T R I C -+
  | Mike Leber   Direct Internet Connections   Voice 510 580 4100 |
  | Hurricane Electric Web Hosting  Colocation   Fax 510 580 4151 |
  | [EMAIL PROTECTED]   http://www.he.net |
  +---+




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-13 Thread Andrew Odlyzko

This is the case of bundling, discussed in the paper I referenced in
the previous message,

  http://www.dtc.umn.edu/~odlyzko/doc/history.communications1b.pdf

It is impossible, at least without detailed studies, to tell what
effect selling individual channels would have.  Bundling
can have benefits for both consumers and producers (and that is
what the cable industry in the US claims applies to their case,
although all we can conclude for sure from their claims is that
they believe it has benefits for them).

Here is a simple example of bundling (something that has been known
in standard economics literature for about 30 years, although in
practice this has been done for thousands of years in various markets):

From what Marshall wrote, it appears that the 2 channels that he and
his family care about are worth at least $40 in total to him, and
everything else is useless.  Suppose (and this may or may not be true)
he and his family value each of these channels, call them A and B,
at $30/month and $20/month, respectively, so in principle the cable 
network could even raise their bundles' prices to a total of $50 
without losing him as a subscriber.

Now suppose that the universe of users consists just of Marshall
and Mikael, except that Mikael and his family are interested in
3 channels, the two channels A and B that Marshall cares about,
and channel C, and suppose the willingness to pay for them is
$10, $5, and $25, respectively.  If the cable company has to
price the channels separately (and let's exclude the ability to
price discriminate, namely charge different prices to Marshall
and Mikael, something that is generally excluded by local franchise
agreements), what will they do?  They will surely ask for $30
for channel A, $20 for channel B, and $25 for channel C, and will
get $50 from Marshall and $25 from Mikael, for a total of $75.
On the other hand, if all they offer is a bundle of all 3 channels
for $40/month, both Marshall and Mikael will pay $40 each for a
total of $80/month.  And note that both Marshall and Mikael will
be getting the bundle for no more (less in Marshall's case) than their 
valuations of individual components.  If $75/month is not enough to 
pay the content providers and maintain the network at a profit, the 
lack of bundling may even lead to death of the network.
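
As a sanity check, here is a tiny Python sketch of that example (the
valuations are the ones assumed above; a buyer purchases anything
priced at or below his valuation):

# Bundling vs. a-la-carte revenue with the example valuations above.
valuations = {
    "Marshall": {"A": 30, "B": 20, "C": 0},
    "Mikael":   {"A": 10, "B": 5,  "C": 25},
}

def alacarte_revenue(prices):
    # Each subscriber buys every channel priced at or below his valuation.
    return sum(p for v in valuations.values()
                 for ch, p in prices.items() if v[ch] >= p)

def bundle_revenue(price):
    # A subscriber buys the bundle if his total valuation covers its price.
    return sum(price for v in valuations.values() if sum(v.values()) >= price)

print(alacarte_revenue({"A": 30, "B": 20, "C": 25}))  # 75
print(bundle_revenue(40))                             # 80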

Andrew

P.S.  And don't forget that having channels is already a form of
bundling, as are newspapers, ...




   On Sat Jan 13, Marshall Eubanks wrote:

  On Jan 13, 2007, at 7:36 AM, Mikael Abrahamsson wrote:

  
   On Sat, 13 Jan 2007, Marshall Eubanks wrote:
  
   A technical issue that I have to deal with is that you get a 30-minute
   show (actually 24 minutes of content) as 30 minutes, _with the ad slots
   included_. To show it without ads, you actually have to take the show
   into a video editor and remove the ad slots, which costs video editor
   time, which is expensive.
  
   Well, in this case you'd hopefully get the show directly from  
   whoever is producing it without ads in the first place, basically  
   the same content you might see if you buy the show on DVD.
  

  I do get it from the producer; that is what they produce. (And the  
  video editor time referred to is people time, not machine time, which  
  is trivial.)

   In the USA at least, the cable companies make you pay for bundles to
   get channels you want. I have to pay for 3 bundles to get 2 channels we
   actually want to watch. (One of these bundles is apparently only sold
   if you are already getting another, which we don't actually care
   about.) So, it actually costs us $40+/month to get the two channels we
   want (plus a bunch we don't). So, it occurs to me that there is a
   business in selling solo channels on the Internet, as is, with the ads,
   for on the order of $5-$10 per subscriber per month, which should leave
   a substantial profit after the payments to the networks and bandwidth
   costs.
  
   There is zero problem for the cable companies to immediately  
   compete with you by offering the same thing, as soon as there is  
   competition. Since their channel is the most established, my guess  
   is that you would have a hard time succeeding where they already  
   have a footprint and established customers.
  
  Yes, and that has the potential of immediately reducing their income  
  by a factor of 2 or more.

  I suspect that they would compete at first by putting pressure on the
  channel aggregators not to sell to such businesses. (Note: this is
  NOT a business I am pursuing at present.)

  What I do conclude from this is that the oncoming wave of IPTV and  
  Internet Television is going to be very disruptive.

   Where you could do well with your proposal is where there is no
   cable TV available at all.

  Regards

  
   -- 
   Mikael Abrahamsson    email: [EMAIL PROTECTED]



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-13 Thread Stephen Sprunk


[ Note: Please don't send MIME/HTML messages to mailing lists ]

Thus spake Gian Constantine:

The cable companies have been chomping at the bit for unbundled
channels for years, and so have consumers. The content providers will
never let it happen. Their claim is that the popular channels support
the diversity of not-so-popular channels. Apparently, production costs
are high all around (not surprising) and most channels do not support
themselves entirely.


Regulators too.  The city here tried forcing the MSOs to unbundle, and 
the result was that a single channel cost the same as the bundle it 
normally came in -- the content providers weren't willing to license 
them individually.  The city gave in and dropped it.


Just like the providers want to force people to pay for unpopular 
channels to subsidize the popular ones, they likewise want people to pay 
for unpopular programs to subsidize the popular ones.  Consumers, OTOH, 
want to buy _programs_, not _channels_.  Hollywood isn't dumb enough to 
fall for that, since they know 90% (okay, that's being conservative) of 
what they produce is crap and the only way to get people to pay for it 
is to jack up the price of the 10% that isn't crap and give the other 
90% away.


Of course, the logical solution is to quit producing crap so that such 
games aren't necessary, but since when has any MPAA or RIAA member 
decided to go that route?


S

Stephen Sprunk      God does not play dice.  --Albert Einstein
CCIE #3723          God is an inveterate gambler, and He throws the
K5SSS               dice at every possible opportunity. --Stephen Hawking





Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-13 Thread Roland Dobbins



On Jan 13, 2007, at 3:01 PM, Stephen Sprunk wrote:


Consumers, OTOH, want to buy _programs_, not _channels_.


This is a very important point - perceived disintermediation,  
perceived unbundling, ad reduction/elimination, and timeshifting are  
the main reasons that DVRs are so popular (and now, placeshifting  
with things like Slingbox and Tivo2Go, though it's very early days in  
that regard).  So, at least on the face of it, there appears to be a  
high degree of congruence between the things which make DVRs  
attractive and things which make P2P attractive.


As to an earlier comment about video editing in order to remove ads,  
this is apparently the norm in the world of people who are heavy  
uploaders/crossloaders of video content via P2P systems.  It seems  
there are different 'crews' who compete to produce a 'quality  
product' in terms of the quality of the encoding, compression,  
bundling/remixing, etc.; it's very reminiscent of the 'warez' scene  
in that regard.


I believe that many of the people engaged in the above process do so  
because it's become a point of pride with them in the social circles  
they inhabit, again a la the warez community.  It's an interesting  
question as to whether or not the energy and 'professional pride' of  
this group of people could somehow be harnessed in order to provide  
and distribute content legally (as almost all of what people really  
want seems to be infringing content under the current standard  
model), and monetized so that they receive compensation and  
essentially act as the packaging and distribution arm for content  
providers willing to try such a model.  A related question is just  
how important the perceived social cachet of editing/rebundling/ 
redistributing -infringing- content is to them, and whether  
normalizing this behavior from a legal standpoint would increase or  
decrease the motivation of the 'crews' to continue providing these  
services in a legitimized commercial environment.


As a side note, it seems there's a growing phenomenon of 'upload  
cheating' taking place in the BitTorrent space, with clients such as  
BitTyrant and BitThief becoming more and more popular while at the  
same time disrupting the distribution economies of P2P networks.   
This has caused a great deal of consternation in the infringing- 
oriented P2P community of interest, with the developers/operators of  
various BitTorrent-type systems such as BitComet working at  
developing methods of detecting and blocking downloading from users  
who 'cheat' in this fashion; it is instructive (and more than a  
little ironic) to watch as various elements within the infringing- 
oriented P2P community attempt to outwit and police one another's  
behavior, especially when compared/contrasted with the same classes  
of ongoing conflict between the infringing-oriented P2P community,  
content producers, and SPs.


---

Roland Dobbins [EMAIL PROTECTED] // 408.527.6376 voice

Technology is legislation.

-- Karl Schroeder






RE: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-12 Thread Frank Bulk

If we're becoming a VOD world, does multicast play any practical role in
video distribution?

Frank

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of
Michal Krsek
Sent: Wednesday, January 10, 2007 2:28 AM
To: Marshall Eubanks
Cc: nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a day, continuously?


Hi Marshall,

 - the largest channel has 1.8% of the audience
 - 50% of the audience is in the largest 2700 channels
 - the least watched channel has ~ 10 simultaneous viewers
 - the multicast bandwidth usage would be 3% of the unicast.

I'm a bit skeptical about the future of channels. For making money from 
the long tail, you have to adapt your distribution to users' needs. It is 
not only format, codec ... but also time frame. You can organise your 
programs in channels, but they will not run simultaneously for all the 
users. I want to control my TV; I don't want my TV to jockey my life.

For the distribution, you as content owner have to help the ISP find the 
right way to distribute your content. For example: having a distribution 
center in a Tier 1 ISP's network will make money from Tier 2 ISPs 
connected directly to the Tier 1. Probably, having a CDN (your own, or 
paying for the service) will be the only way to do large-scale 
non-synchronous programming.

Regards
Michal




RE: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-12 Thread Frank Bulk

You mean the NCTC?  Yes, they did close their doors to new membership, but
there are regional head ends that represent a larger number of ITCs that
have been able to directly negotiate with the content providers.

And then there are the turnkey vendors: IPTV Americas, SES Americom's
IP-PRIME, and Falcon Communications.

It's not entirely impossible.

Frank



From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Gian
Constantine
Sent: Wednesday, January 10, 2007 7:47 AM
To: [EMAIL PROTECTED]
Cc: Marshall Eubanks; nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a day, continuously?


Many of the small carriers, who are doing IPTV in the U.S., have acquired
their content rights through a consortium, which has since closed its doors
to new membership. 

I cannot stress this enough: content is the key to a good industry-changing
business model. Broad appeal content will gain broad interest. Broad
interest will change the playing field and compel content providers to
consider alternative consumption/delivery models.

The ILECs are going to do it. They have deep pockets. Look at how quickly
they were able to get franchising laws adjusted to allow them to offer
video. 

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-12 Thread Gian Constantine

Yes, the NCTC.

I have spoken with two of the vendors you mentioned. Neither has 
pass-through licensing rights. I still have to go directly to most of 
the content providers to get the proper licensing rights.


There are a few vendors out there who will help a company attain  
these rights, but the solution is not turnkey on licensing. To be  
clear, it is not turnkey for the major U.S. content providers.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.


On Jan 12, 2007, at 10:14 AM, Frank Bulk wrote:

You mean the NCTC?  Yes, they did close their doors to new membership, but
there are regional head ends that represent a larger number of ITCs that
have been able to directly negotiate with the content providers.

And then there are the turnkey vendors: IPTV Americas, SES Americom's
IP-PRIME, and Falcon Communications.

It's not entirely impossible.

Frank



From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Gian Constantine
Sent: Wednesday, January 10, 2007 7:47 AM
To: [EMAIL PROTECTED]
Cc: Marshall Eubanks; nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a day, continuously?



Many of the small carriers, who are doing IPTV in the U.S., have acquired
their content rights through a consortium, which has since closed its doors
to new membership.

I cannot stress this enough: content is the key to a good industry-changing
business model. Broad appeal content will gain broad interest. Broad
interest will change the playing field and compel content providers to
consider alternative consumption/delivery models.

The ILECs are going to do it. They have deep pockets. Look at how quickly
they were able to get franchising laws adjusted to allow them to offer
video.

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.






Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-12 Thread Gian Constantine
I am pretty sure we are not becoming a VoD world. Linear programming  
is much better for advertisers. I do not think content providers, nor  
consumers, would prefer a VoD only service. A handful of consumers  
would love it, but many would not.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.


On Jan 12, 2007, at 10:05 AM, Frank Bulk wrote:



If we're becoming a VOD world, does multicast play any practical role in 
video distribution?

Frank

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Michal Krsek
Sent: Wednesday, January 10, 2007 2:28 AM
To: Marshall Eubanks
Cc: nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a day, continuously?



Hi Marshall,


- the largest channel has 1.8% of the audience
- 50% of the audience is in the largest 2700 channels
- the least watched channel has ~ 10 simultaneous viewers
- the multicast bandwidth usage would be 3% of the unicast.


I'm a bit skeptical about the future of channels. For making money from 
the long tail, you have to adapt your distribution to users' needs. It is 
not only format, codec ... but also time frame. You can organise your 
programs in channels, but they will not run simultaneously for all the 
users. I want to control my TV; I don't want my TV to jockey my life.

For the distribution, you as content owner have to help the ISP find the 
right way to distribute your content. For example: having a distribution 
center in a Tier 1 ISP's network will make money from Tier 2 ISPs 
connected directly to the Tier 1. Probably, having a CDN (your own, or 
paying for the service) will be the only way to do large-scale 
non-synchronous programming.

Regards
Michal






RE: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-12 Thread Frank Bulk
Gian:
 
I haven't spoken to any of those turnkey providers.  Sounds like just the
hardware, plant infrastructure, and transport is turnkey. =)

Getting content rights is a [EMAIL PROTECTED]  That and the associated price
tag is probably the largest non-technical barrier to IP TV deployments today.
 
Frank


From: Gian Constantine [mailto:[EMAIL PROTECTED] 
Sent: Friday, January 12, 2007 9:24 AM
To: [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]; Marshall Eubanks; nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a day, continuously?


Yes, the NCTC. 

I have spoken with two of the vendors you mentioned. Neither has
pass-through licensing rights. I still have to go directly to most of the
content providers to get the proper licensing rights.

There are a few vendors out there who will help a company attain these
rights, but the solution is not turnkey on licensing. To be clear, it is not
turnkey for the major U.S. content providers.

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.


On Jan 12, 2007, at 10:14 AM, Frank Bulk wrote:


You mean the NCTC?  Yes, they did close their doors to new membership, but
there are regional head ends that represent a larger number of ITCs that
have been able to directly negotiate with the content providers.

And then there are the turnkey vendors: IPTV Americas, SES Americom's
IP-PRIME, and Falcon Communications.

It's not entirely impossible.

Frank



From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Gian
Constantine
Sent: Wednesday, January 10, 2007 7:47 AM
To: [EMAIL PROTECTED]
Cc: Marshall Eubanks; nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a day, continuously?


Many of the small carriers, who are doing IPTV in the U.S., have acquired
their content rights through a consortium, which has since closed its doors
to new membership. 

I cannot stress this enough: content is the key to a good industry-changing
business model. Broad appeal content will gain broad interest. Broad
interest will change the playing field and compel content providers to
consider alternative consumption/delivery models.

The ILECs are going to do it. They have deep pockets. Look at how quickly
they were able to get franchising laws adjusted to allow them to offer
video. 

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.






Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-12 Thread Michal Krsek
Dear Gian,
from my perspective (central Europe) it looks like linear programming is 
used only in TV/radio channels. But this is only a part of the media 
industry. Cinema, DVD and other forms of content distribution aren't 
linear. I don't like to waste Internet capacity with URLs to large VoD 
community servers.

I don't have enough speaking power to write any strict statements, but I 
think the world of the media industry will use every existing channel of 
revenue. The question isn't if, but when. Some people prefer having their 
eleven-button remote, but some want to consume content they have chosen, 
at a time they have chosen. Maybe I'm wrong, but I don't know anybody from 
the teen generation who likes to be TV-channel driven (maybe I'm in a bad 
country :-)).

Regards
Michal

  - Original Message - 
  From: Gian Constantine 
  To: [EMAIL PROTECTED] 
  Cc: nanog@merit.edu 
  Sent: Friday, January 12, 2007 4:26 PM
  Subject: Re: Network end users to pull down 2 gigabytes a day, continuously?


  I am pretty sure we are not becoming a VoD world. Linear programming is much 
better for advertisers. I do not think content providers, nor consumers, would 
prefer a VoD only service. A handful of consumers would love it, but many would 
not.


  Gian Anthony Constantine
  Senior Network Design Engineer
  Earthlink, Inc.




  On Jan 12, 2007, at 10:05 AM, Frank Bulk wrote:




If we're becoming a VOD world, does multicast play any practical role in
video distribution?


Frank


-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of
Michal Krsek
Sent: Wednesday, January 10, 2007 2:28 AM
To: Marshall Eubanks
Cc: nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a day, continuously?




Hi Marshall,


  - the largest channel has 1.8% of the audience
  - 50% of the audience is in the largest 2700 channels
  - the least watched channel has ~ 10 simultaneous viewers
  - the multicast bandwidth usage would be 3% of the unicast.


I'm a bit skeptical about the future of channels. For making money from 
the long tail, you have to adapt your distribution to users' needs. It is 
not only format, codec ... but also time frame. You can organise your 
programs in channels, but they will not run simultaneously for all the 
users. I want to control my TV; I don't want my TV to jockey my life.

For the distribution, you as content owner have to help the ISP find the 
right way to distribute your content. For example: having a distribution 
center in a Tier 1 ISP's network will make money from Tier 2 ISPs 
connected directly to the Tier 1. Probably, having a CDN (your own, or 
paying for the service) will be the only way to do large-scale 
non-synchronous programming.


Regards
Michal







Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-12 Thread Michael Painter


- Original Message - 
From: Gian Constantine

Sent: Friday, January 12, 2007 5:24 AM
Subject: Re: Network end users to pull down 2 gigabytes a day, continuously?


Yes, the NCTC.
I have spoken with two of the vendors you mentioned. Neither has pass-through 
licensing rights. I still have to go directly to most of the content 
providers to get the proper licensing rights.
There are a few vendors out there who will help a company attain these 
rights, but the solution is not turnkey on licensing. To be clear, it is not 
turnkey for the major U.S. content providers.


Back in the 'day', these folks were great to work with, but I have no idea of how they 
would deal with IPTV.
http://www.4com.com/Company-Profile.html

Btw, I thought VoD was one of the main drivers of IPTV, at the local level at 
least.

--Michael





Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-12 Thread Gian Constantine
I have spoken with a colleague in the industry regarding 4com. 
Apparently, they have been able to acquire some sort of pass-through 
licensing on much of the content, but I have not spoken directly with 
4com. I heard the same of Broadstream and SES Americom, but both 
proved to be more of an aid in acquisition, and not outright 
pass-through rights.


VoD is one of the main drivers, along with HD, but neither is a full 
service alone. Consumers will demand linear programming. They have 
become accustomed to it. More importantly, the advertisers have 
become accustomed to it.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.

On Jan 12, 2007, at 5:29 PM, Michael Painter wrote:


- Original Message - From: Gian Constantine
Sent: Friday, January 12, 2007 5:24 AM
Subject: Re: Network end users to pull down 2 gigabytes a day,  
continuously?



Yes, the NCTC.
I have spoken with two of the vendors you mentioned. Neither has 
pass-through licensing rights. I still have to go directly to most 
of the content providers to get the proper licensing rights.
There are a few vendors out there who will help a company attain 
these rights, but the solution is not turnkey on licensing. To be 
clear, it is not turnkey for the major U.S. content providers.


Back in the 'day', these folks were great to work with, but I have  
no idea of how they would deal with IPTV.

http://www.4com.com/Company-Profile.html

Btw, I thought VoD was one of the main drivers of IPTV, at the  
local level at least.


--Michael







Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-12 Thread Marshall Eubanks



On Jan 12, 2007, at 10:05 AM, Frank Bulk wrote:



If we're becoming a VOD world, does multicast play any practical role in 
video distribution?


Not to end users.

I think multicast is used a fair amount for precaching; presumably  
that would increase in this scenario.


Regards
Marshall

P.S. Of course, I do not agree we are moving to a pure VOD world. I  
agree with Michal Krsek in this regard.




Frank

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Michal Krsek
Sent: Wednesday, January 10, 2007 2:28 AM
To: Marshall Eubanks
Cc: nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a day, continuously?



Hi Marshall,


- the largest channel has 1.8% of the audience
- 50% of the audience is in the largest 2700 channels
- the least watched channel has ~ 10 simultaneous viewers
- the multicast bandwidth usage would be 3% of the unicast.


I'm a bit skeptical about the future of channels. For making money from 
the long tail, you have to adapt your distribution to users' needs. It is 
not only format, codec ... but also time frame. You can organise your 
programs in channels, but they will not run simultaneously for all the 
users. I want to control my TV; I don't want my TV to jockey my life.

For the distribution, you as content owner have to help the ISP find the 
right way to distribute your content. For example: having a distribution 
center in a Tier 1 ISP's network will make money from Tier 2 ISPs 
connected directly to the Tier 1. Probably, having a CDN (your own, or 
paying for the service) will be the only way to do large-scale 
non-synchronous programming.

Regards
Michal






Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-12 Thread Steve Sobol

On Sat, 13 Jan 2007, Mikael Abrahamsson wrote:

 My experience is that when you show people VoD, they like it. 

I have to admit the wow factor is there. But I already have access to VoD 
through my cable company and its set-top boxes. TV over IP brings my 
family exactly zero additional benefits.

-- 
Steve Sobol, Professional Geek ** Java/VB/VC/PHP/Perl ** Linux/*BSD/Windows
Victorville, California PGP:0xE3AE35ED

It's all fun and games until someone starts a bonfire in the living room.



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-11 Thread Stephen Sprunk


Thus spake Marshall Eubanks [EMAIL PROTECTED]

On Jan 10, 2007, at 11:19 PM, Thomas Leavitt wrote:
I don't think consumers are going to accept having to wait for a 
scheduled broadcast of whatever piece of video content they want 
to view - at least if the alternative is being able to download and 
watch it nearly


That's the pull model. The push model will also exist. Both will make 
money.


There's a severe Layer 8 problem, though, because most businesses seem 
to pursue only one delivery strategy, instead of viewing them as 
complementary and using _all_ of them as appropriate.


When IP STBs start appearing, most of them _should_ have some sort of 
feature to subscribe to certain programs.  That means when a program is 
released for distribution, there will be millions of people waiting for 
it.  Push it out via mcast or P2P at 3am and it'll be waiting for them 
when they wake up (or 3pm, ready when they come home from work).  Folks 
who want older programs would need to select a show and the STB would 
grab it via P2P or pull methods.


Mcast has the advantage that STBs could opportunistically cache all 
recent content in case the user wants to browse the latest programs 
they haven't subscribed to, aka channel surfing.  This doesn't make 
sense with P2P due to the waste of bandwidth, and it's not very 
effective with pull content because most folks still can't get a high 
enough bitrate from some distant colo into their homes to pull content 
as fast as they consume it.


The TV pirates have figured most of this out.  Most BitTorrent clients 
these days support RSS feeds, and there are dozens of sites that will 
give you a feed for particular shows (at least those popular enough to 
be pirated) so that your client will start pulling it as soon as it hits 
the 'net; shows like 24 will have _tens of thousands_ of clients 
downloading a new episode within minutes.  Likewise, the same sites 
offer catalogs going back several years so that you can pick nearly any 
episode and watch it within a couple hours.  Mcast is the one piece 
missing, but perhaps if it's not being used that's just yet another sign 
it's a solution in search of a problem, as critics have been saying for 
the last decade?
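
(A minimal sketch, in Python, of the subscribe-and-prefetch behaviour
described above. The feed URL and filenames are placeholders; a real
client would persist its state and hand each enclosure to a BitTorrent
download rather than plain HTTP:)

import time
import urllib.request
import xml.etree.ElementTree as ET
from datetime import datetime, timedelta

FEED_URL = "http://example.net/shows.rss"    # hypothetical feed
seen = set()                                 # episodes already fetched

def new_enclosures(feed_url):
    """Return enclosure URLs from RSS items not seen before."""
    with urllib.request.urlopen(feed_url) as resp:
        root = ET.parse(resp).getroot()
    urls = []
    for item in root.iter("item"):
        enc = item.find("enclosure")
        if enc is not None and enc.get("url") and enc.get("url") not in seen:
            urls.append(enc.get("url"))
    return urls

def seconds_until(hour):
    """Seconds from now until the next occurrence of hour:00."""
    now = datetime.now()
    target = now.replace(hour=hour, minute=0, second=0, microsecond=0)
    if target <= now:
        target += timedelta(days=1)
    return (target - now).total_seconds()

while True:
    pending = new_enclosures(FEED_URL)
    if pending:
        time.sleep(seconds_until(3))          # wait for the 3am quiet hours
        for url in pending:
            urllib.request.urlretrieve(url, url.rsplit("/", 1)[-1])
            seen.add(url)
    time.sleep(15 * 60)                       # re-poll every 15 minutes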


There is no technical challenge here; what the pirates are already doing 
works pretty well, and with a little UI work it'd even be ready for the 
mass market.  The challenges are figuring out how to pay for the pipes 
needed to deliver all these bits at consumer rates, and how to collect 
revenue from all the viewers to fairly compensate the producers -- both 
business problems, though for different folks.  Interesting problems to 
solve, but NANOG probably isn't the appropriate forum.


S

Stephen Sprunk    "God does not play dice."  --Albert Einstein
CCIE #3723        "God is an inveterate gambler, and He throws the
K5SSS              dice at every possible opportunity." --Stephen Hawking





Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Frank Coluccio
 outside of a closed network.
Don't forget. Even the titles you mentioned are still owned by very large
companies interested in squeezing every possible dime from their assets. They
would not be cheap to acquire.
Further, torrent-like distribution is a long long way away from sign off by the
content providers. They see torrents as the number one tool of content piracy.
This is a major reason I see the discussion of tripping upstream usage limits
through content distribution as moot.
I am with you on the vision of massive content libraries at the fingertips of
all, but I see many roadblocks in the way. And, almost none of them are 
technical
in nature.
Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]


On Jan 8, 2007, at 7:51 PM, Bora Akyol wrote:
 
Please see my comments inline:
-Original Message-
From: Gian Constantine [mailto:[EMAIL PROTECTED]]
Sent: Monday, January 08, 2007 4:27 PM
To: Bora Akyol
Cc: nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a day, continuously?
 snip 
I would also argue storage and distribution costs are not asymptotically zero
with scale. Well designed SANs are not cheap. Well designed distribution systems
are not cheap. While price does decrease when scaled upwards, the cost of such 
an
operation remains hefty, and increases with additions to the offered content
library and a swelling of demand for this content. I believe the graph becomes
neither asymptotic, nor anywhere near zero. 
To the end user, there is no cost to downloading videos when they are
sleeping. I would argue that other than sports (and some news) events, there
is pretty much no content that needs to be real time. What the downloading
(possibly 24x7) does is to stress the ISP network to its max, since the
assumptions of statistical multiplexing go out the window. Think of a Tivo
that downloads content off the Internet 24x7. The user is still paying for
only what they pay each month, and this is network neutrality 2.0 all over
again.

You are correct on the long tail nature of music. But music is not consumed
in a similar manner as TV and movies. Television and movies involve a little
more commitment and attention. Music is more for the moment and the mood.
There is an immediacy with music consumption. Movies and television require
a slight degree more patience from the consumer. The freshness (debatable
:-) ) of new release movies and TV can often command the required patience
from the consumer. Older content rarely has the same pull.

I would argue against your distinction between visual and auditory content.
There is a lot of content out there that a lot of people watch, and the
content is 20-40+ years old. Think Brady Bunch, Bonanza, or archived games
from NFL, MLB etc. What about Smurfs (for those of us with kids)?

This is only the beginning.

If I can get a 500GB box and download MP4 content, that's a lot of
essentially free storage.

Coming back to NANOG content, I think video (not streamed but multi-path
distributed video) is going to bring the networks down not by sheer
bandwidth alone but by challenging the assumptions behind the engineering of
the network. I don't think you need huge SANs per se to store the content
either; since it is multi-source/multi-sink, the reliability is built-in.

The SPs like Verizon & AT&T moving fiber to the home hoping to get in on the
value-add action are in for an awakening IMHO.

Regards
Bora

ps. I apologize for the tone of my previous email. That sounded grumpier
than I usually am.

 
 

-- Thomas Leavitt - [EMAIL PROTECTED] - 831-295-3917 (cell)
*** Independent Systems and Network Consultant, Santa Cruz, CA ***




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Michal Krsek


Hi Marshall,


- the largest channel has 1.8% of the audience
- 50% of the audience is in the largest 2700 channels
- the least watched channel has ~ 10 simultaneous viewers
- the multicast bandwidth usage would be 3% of the unicast.


I'm a bit skeptical about the future of channels. To make money from the long 
tail, you have to adapt your distribution to users' needs. It is not 
only format, codec ... but also time frame. You can organise your programs 
in channels, but they will not run simultaneously for all the users. I want 
to control my TV; I don't want my TV to jockey my life.


For the distribution, you as content owner have to help the ISP find the 
right way to distribute your content. For example: having a distribution center 
in a Tier1 ISP's network will make money from the Tier2 ISPs connected directly 
to that Tier1. Probably, having a CDN (your own, or paying for the service) will 
be the only way to do large-scale non-synchronous programming.


   Regards
   Michal
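
(A back-of-the-envelope check on the channel figures quoted above. The
per-channel audiences below are invented purely for illustration; the
only point is that unicast cost scales with viewers while multicast
cost scales with channels:)

viewers = [120, 60, 40, 25, 20, 15, 10, 5, 3, 2]   # hypothetical audiences
stream_kbps = 1500                                 # assumed SD bitrate

unicast = stream_kbps * sum(viewers)       # one stream per viewer (300)
multicast = stream_kbps * len(viewers)     # one stream per channel (10)
print(f"multicast is {100 * multicast / unicast:.1f}% of unicast")   # 3.3%
# An average audience of ~30 viewers per channel lands near the quoted 3%.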



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Richard Naylor


At 08:40 p.m. 9/01/2007 -0500, Gian Constantine wrote:
It would not be any easier. The negotiations are very complex. The issue 
is not one of infrastructure capex. It is one of jockeying between content 
providers (big media conglomerates) and the video service providers (cable 
companies).


We're seeing a degree of co-operation in this area. It's being driven by the 
market - see below.


snip
On Jan 9, 2007, at 7:57 PM, Bora Akyol wrote:
An additional point to consider is that it takes a lot of effort and money
to get a channel allocated to your content in a cable network.

This is much easier when TV is being distributed over the Internet.


The other, bigger driver is that for most broadcasters (both TV and Radio), 
advertising revenues are flat, *except* in the on-line area. So they are 
chasing on-line growth like crazy. Typically on-line revenues now make up 
around 25% of income.


So broadcasters are reacting and developing quite large systems for 
delivering content both new and old. We're seeing these as a mixture of 
live streams, on-demand streams, on-demand downloads and torrents. 
Basically, anything that works and is reliable and can be scaled. (we 
already do geographic distribution and anycast routing).


And the broadcasters won't pay flash transit charges. They are doing this 
stuff from within existing budgets. They will put servers in different 
countries if it makes financial sense. We have servers in the USA, and 
their biggest load is non-peering NZ-based ISPs.


And broadcasters aren't the only source of large content. My estimate is 
that they are only 25% of the source. Somewhere last year I heard John 
Chambers say that many corporates are seeing 500% growth in LAN traffic - 
fueled by video.


We do outside webcasting - to give you an idea of traffic, when we get a 
fiber connex, we allow for 6GBytes per day between an encoder and the 
server network - per programme. We often produce several different 
programmes from a site in different languages etc. Each one is 6GB. If we 
don't have fiber, it scales down to about 2GB per programme. (on fiber we 
crank out a full 2Mbps Standard Def stream, on satellite we only get 2Mbps 
per link). I have a chart by my phone that gives the minute/hour/day/month 
traffic impact of a whole range of streams and refer to it every day. Oh - 
we can do 1080i on demand and can and do produce content in that format. 
They're 8Mbps streams. Not many viewers tho :-)   We're close to being able 
to webcast it live.
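
(A sketch of such a cheat sheet in Python, using the bitrates quoted
above; constant-bitrate streams and decimal gigabytes assumed:)

def gigabytes(kbps, seconds):
    return kbps * 1000 * seconds / 8 / 1e9

PERIODS = [("min", 60), ("hour", 3600), ("day", 86400), ("month", 30 * 86400)]

for name, kbps in [("SD 2 Mbps", 2000), ("HD 8 Mbps", 8000)]:
    row = "  ".join(f"{gigabytes(kbps, s):8.2f} GB/{p}" for p, s in PERIODS)
    print(f"{name:10s} {row}")
# SD 2 Mbps comes to 0.015 GB/min, 0.9 GB/hour, 21.6 GB/day, 648 GB/month.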


We currently handle 50+ radio stations and 12 TV stations, handling around 
1.5 to 2million players a month, in a country with a population of 
4million. But then my stats could be lying..


Rich
(long time lurker)




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Michael . Dillon

 How many channels can you get on your (terrestrial) broadcast receiver?

There are about 30 channels broadcast free-to-air
on digital freeview in the UK. I only have so many
hours in the day so I never have a problem in finding
something. Some people are TV junkies or they only
want some specific content so they get satellite dishes.
Any Internet TV service has a limited market because
it competes head-on with free-to-air and satellite
services. And it is difficult to plug Internet TV into
your existing TV setup.

--Michael Dillon



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Mikael Abrahamsson


On Tue, 9 Jan 2007, [EMAIL PROTECTED] wrote:

between handling 30K unicast streams, and 30K multicast streams that 
each have only one or at most 2-3 viewers?


My opinion on the downside of video multicast is that if you want it 
realtime, your SLA figures on acceptable packet loss go down from 
fractions of a percent into the thousandths of a percent, at least with 
current implementations of video.


Imagine internet multicast and having customers complain about bad video 
quality and trying to chase down that last 1/10 packet loss that 
makes peoples video pixelate every 20-30 minutes, and the video stream 
doesn't even originate in your network?


For multicast video to be easier to implement we need more robust video 
codecs that can handle jitter and packet loss that are currently present 
in networks and handled acceptably by TCP for unicast.


--
Mikael Abrahamsson    email: [EMAIL PROTECTED]


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Marshall Eubanks


On Jan 10, 2007, at 5:42 AM, Mikael Abrahamsson wrote:



On Tue, 9 Jan 2007, [EMAIL PROTECTED] wrote:

between handling 30K unicast streams, and 30K multicast streams  
that each have only one or at most 2-3 viewers?


My opinion on the downside of video multicast is that if you want  
it realtime, your SLA figures on acceptable packet loss go down  
from fractions of a percent into the thousandths of a percent, at  
least with current implementations of video.




Actually, this is true with unicast as well.

This can (I think) largely be handled by a fairly moderate amount of  
Forward Error Correction.


Regards
Marshall


Imagine internet multicast and having customers complain about bad  
video quality and trying to chase down that last 1/10 packet  
loss that makes peoples video pixelate every 20-30 minutes, and the  
video stream doesn't even originate in your network?


For multicast video to be easier to implement we need more robust  
video codecs that can handle jitter and packet loss that are  
currently present in networks and handled acceptably by TCP for  
unicast.


--
Mikael Abrahamsson    email: [EMAIL PROTECTED]




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Simon Lockhart

On Wed Jan 10, 2007 at 09:43:11AM +, [EMAIL PROTECTED] wrote:
 And it is difficult to plug Internet TV into your existing TV setup.

Can your average person plug in a cable / satellite / terrestrial box (in the
UK, the only mainstream option here for self-install is terrestrial)? Power,
TV, and antenna? Then why can't they plug in Power, TV & phone line? That's
where IPTV STBs are going...

Simon


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Gian Constantine
Many of the small carriers, who are doing IPTV in the U.S., have  
acquired their content rights through a consortium, which has since  
closed its doors to new membership.


I cannot stress this enough: content is the key to a good industry- 
changing business model. Broad appeal content will gain broad  
interest. Broad interest will change the playing field and compel  
content providers to consider alternative consumption/delivery models.


The ILECs are going to do it. They have deep pockets. Look at how  
quickly they were able to get franchising laws adjusted to allow them  
to offer video.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.

On Jan 10, 2007, at 2:30 AM, Christian Kuhtz wrote:


Marshall,

I completely agree, and due diligence on business models will show  
that fact very clearly.  And nothing much has changed here in terms  
of substance over the last 4+ yrs either.  Costs and opportunities  
have changed or evolved rather, but not the mechanics.


Infrastructure capital is very much the gating factor in every  
major video distribution infrastructure (and the reason why DOCSIS  
3.0 is just such a neato thing).  The carriage deals are merely  
table stakes, and that doesn't mean they're easy.  They are  
obtainable.


And some business models are just fundamentally broken.

Examples of infrastructure costs: the size of CSAs, or the cost of  
upgrading CPE, is a far bigger deal than carriage.  And if you can't  
get into RTs in an ILEC colo arrangement, that doesn't per se  
globally invalidate business models, but rather provides unique  
challenges and limitations on a given specific business model.


What has changed is that ppl are actually 'doing it'.  And that  
proves that several models are viable for funding in all sorts of  
flavors and risks.


IPTV is fundamentally subject to the analog fallacies of VoIP  
replacing 1FR/1BR service on a 1:1 basis (toll arbitrage or anomalies  
aside).  There seems to be plenty of that.  A new IP service  
offering no unique features over specialized and depreciated  
infrastructure will not be viable until commoditized, and not at an  
early maturity level like where IPTV is at.


Unless an IPTV service offers a compelling cost advantage, mass  
adoption will not occur.  And any cost increase will have to be  
justifiable to consumers, and that cannot be underestimated.


But, some just continue to ignore those fundamentals and those  
business models will fail.  And we should be thankful for that self  
cleansing action of a functioning market.


Enough rambling after a long day at CES, I suppose.  Thanks for  
reading this far.


Best regards,
Christian

--
Sent from my BlackBerry.

-Original Message-
From: Marshall Eubanks [EMAIL PROTECTED]
Date: Wed, 10 Jan 2007 01:52:06
To:Gian Constantine [EMAIL PROTECTED]
Cc:Bora Akyol [EMAIL PROTECTED],Simon Lockhart  
[EMAIL PROTECTED], [EMAIL PROTECTED],nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a day,  
continuously?




On Jan 9, 2007, at 8:40 PM, Gian Constantine wrote:


It would not be any easier. The negotiations are very complex. The
issue is not one of infrastructure capex. It is one of jockeying
between content providers (big media conglomerates) and the video
service providers (cable companies).


Not necessarily. Depends on your business model.

Regards
Marshall



Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.


On Jan 9, 2007, at 7:57 PM, Bora Akyol wrote:



Simon

An additional point to consider is that it takes a lot of effort and money
to get a channel allocated to your content in a cable network.

This is much easier when TV is being distributed over the Internet.



-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
Behalf Of Simon Lockhart
Sent: Tuesday, January 09, 2007 2:42 PM
To: [EMAIL PROTECTED]
Cc: nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a
day, continuously?


On Tue Jan 09, 2007 at 07:52:02AM +,
[EMAIL PROTECTED] wrote:

Given that the broadcast model for streaming content
is so successful, why would you want to use the
Internet for it? What is the benefit?


How many channels can you get on your (terrestrial) broadcast
receiver?

If you want more, your choices are satellite or cable. To get
cable, you
need to be in a cable area. To get satellite, you need to
stick a dish on
the side of your house, which you may not want to do, or may
not be allowed
to do.

With IPTV, you just need a phoneline (and be close enough to
the exchange/CO
to get decent xDSL rate). In the UK, I'm already delivering
40+ channels over
IPTV (over inter-provider multicast, to any UK ISP that wants it).

Simon












Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Alexander Harrowell


On 1/10/07, Simon Lockhart [EMAIL PROTECTED] wrote:


On Wed Jan 10, 2007 at 09:43:11AM +, [EMAIL PROTECTED] wrote:
 And it is difficult to plug Internet TV into your existing TV setup.

Can your average person plug in a cable / satellite / terrestrial box (in the
UK, the only mainstream option here for self-install is terrestrial)? Power,
TV, and antenna? Then why can't they plug in Power, TV & phone line? That's
where IPTV STBs are going...

Simon



Especially as more and more ISPs/telcos hand out WLAN boxen of various
kinds - after all, once you have some sort of Linux (usually)
networked appliance in the user's premises, it's quite simple to
deploy more services (hosted VoIP, IPTV, media centre, connected
storage, maybe SIP/Asterisk..) on top of that.

Slingbox-like features and mobile-world things like UMA are also
pushing us that way.


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Gian Constantine
 is the ideal low-impact solution to content delivery. And  
absolutely, a
500GB drive would almost be overkill on space for disposable  
content encoded in
H.264. Excellent SD (480i) content can be achieved at ~1200 to  
1500kbps,
resulting in about a 1GB file for a 90 minute title. HD is almost  
out of the
question for internet download, given good 720p at ~5500kbps,  
resulting in a 30GB

file for a 90 minute title.
Service providers wishing to provide this service to their  
customers may see
some success where they control the access medium (copper loop,  
coax, FTTH).
Offering such a service to customers outside of this scope would  
prove very
expensive, and likely, would never see a return on the investment  
without
extensive peering arrangements. Even then, distribution rights  
would be very
difficult to attain without very deep pockets and crippling revenue  
sharing. The
studios really dislike the idea of transmission outside of a closed  
network.
Don't forget. Even the titles you mentioned are still owned by very  
large
companies interested in squeezing every possible dime from their  
assets. They

would not be cheap to acquire.
Further, torrent-like distribution is a long long way away from  
sign off by the
content providers. They see torrents as the number one tool of  
content piracy.
This is a major reason I see the discussion of tripping upstream  
usage limits

through content distribution as moot.
I am with you on the vision of massive content libraries at the  
fingertips of
all, but I see many roadblocks in the way. And, almost none of them  
are technical

in nature.
Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]



On Jan 8, 2007, at 7:51 PM, Bora Akyol wrote:

Please see my comments inline:
-Original Message-
From: Gian Constantine [mailto:[EMAIL PROTECTED]]
Sent: Monday, January 08, 2007 4:27 PM
To: Bora Akyol
Cc: nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a day, continuously?

snip
I would also argue storage and distribution costs are not  
asymptotically zero
with scale. Well designed SANs are not cheap. Well designed  
distribution systems
are not cheap. While price does decrease when scaled upwards, the  
cost of such an
operation remains hefty, and increases with additions to the  
offered content
library and a swelling of demand for this content. I believe the  
graph becomes

neither asymptotic, nor anywhere near zero.
To the end user, there is no cost to downloading videos when they are
sleeping. I would argue that other than sports (and some news) events, there
is pretty much no content that needs to be real time. What the downloading
(possibly 24x7) does is to stress the ISP network to its max, since the
assumptions of statistical multiplexing go out the window. Think of a Tivo
that downloads content off the Internet 24x7. The user is still paying for
only what they pay each month, and this is network neutrality 2.0 all over
again.


You are correct on the long tail nature of music. But music is not consumed
in a similar manner as TV and movies. Television and movies involve a little
more commitment and attention. Music is more for the moment and the mood.
There is an immediacy with music consumption. Movies and television require
a slight degree more patience from the consumer. The freshness (debatable
:-) ) of new release movies and TV can often command the required patience
from the consumer. Older content rarely has the same pull.

I would argue against your distinction between visual and auditory content.
There is a lot of content out there that a lot of people watch, and the
content is 20-40+ years old. Think Brady Bunch, Bonanza, or archived games
from NFL, MLB etc. What about Smurfs (for those of us with kids)?

This is only the beginning.

If I can get a 500GB box and download MP4 content, that's a lot of
essentially free storage.

Coming back to NANOG content, I think video (not streamed but multi-path
distributed video) is going to bring the networks down not by sheer
bandwidth alone but by challenging the assumptions behind the engineering of
the network. I don't think you need huge SANs per se to store the content
either; since it is multi-source/multi-sink, the reliability is built-in.

The SPs like Verizon & AT&T moving fiber to the home hoping to get in on the
value-add action are in for an awakening IMHO.

Regards
Bora

ps. I apologize for the tone of my previous email. That sounded grumpier
than I usually am.





-- Thomas Leavitt - [EMAIL PROTECTED] - 831-295-3917 (cell)
*** Independent Systems and Network Consultant, Santa Cruz, CA ***







Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Gian Constantine

All H.264?

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.

On Jan 10, 2007, at 4:41 AM, Richard Naylor wrote:



At 08:40 p.m. 9/01/2007 -0500, Gian Constantine wrote:
It would not be any easier. The negotiations are very complex. The  
issue is not one of infrastructure capex. It is one of jockeying  
between content providers (big media conglomerates) and the video  
service providers (cable companies).


We're seeing a degree of co-operation in this area. It's being  
driven by the market - see below.


snip
On Jan 9, 2007, at 7:57 PM, Bora Akyol wrote:
An additional point to consider is that it takes a lot of effort and money
to get a channel allocated to your content in a cable network.

This is much easier when TV is being distributed over the Internet.


The other, bigger driver is that for most broadcasters (both TV and  
Radio), advertising revenues are flat, *except* in the on-line  
area. So they are chasing on-line growth like crazy. Typically  
on-line revenues now make up around 25% of income.


So broadcasters are reacting and developing quite large systems for  
delivering content both new and old. We're seeing these as a  
mixture of live streams, on-demand streams, on-demand downloads and  
torrents. Basically, anything that works and is reliable and can be  
scaled. (we already do geographic distribution and anycast routing).


And the broadcasters won't pay flash transit charges. They are  
doing this stuff from within existing budgets. They will put  
servers in different countries if it makes financial sense. We have  
servers in the USA, and their biggest load is non-peering NZ  based  
ISPs.


And broadcasters aren't the only source of large content. My  
estimate is that they are only 25% of the source. Somewhere last  
year I heard John Chambers say that many corporates are seeing 500%  
growth in LAN traffic - fueled by video.


We do outside webcasting - to give you an idea of traffic, when we  
get a fiber connex, we allow for 6GBytes per day between an encoder  
and the server network - per programme. We often produce several  
different programmes from a site in different languages etc. Each  
one is 6GB. If we don't have fiber, it scales down to about 2GB per  
programme. (on fiber we crank out a full 2Mbps Standard Def stream,  
on satellite we only get 2Mbps per link). I have a chart by my  
phone that gives the minute/hour/day/month traffic impact of a  
whole range of streams and refer to it every day. Oh - we can do  
1080i on demand and can and do produce content in that format.  
They're 8Mbps streams. Not many viewers tho :-)   We're close to  
being able to webcast it live.


We currently handle 50+ radio stations and 12 TV stations, handling  
around 1.5 to 2million players a month, in a country with a  
population of 4million. But then my stats could be lying..


Rich
(long time lurker)






Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Gian Constantine
Sounds a little like low buffering and sparse I-frames, but I'm no  
MPEG expert. :-)


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.

On Jan 10, 2007, at 5:42 AM, Mikael Abrahamsson wrote:



On Tue, 9 Jan 2007, [EMAIL PROTECTED] wrote:

between handling 30K unicast streams, and 30K multicast streams  
that each have only one or at most 2-3 viewers?


My opinion on the downside of video multicast is that if you want  
it realtime, your SLA figures on acceptable packet loss go down  
from fractions of a percent into the thousandths of a percent, at  
least with current implementations of video.


Imagine internet multicast and having customers complain about bad  
video quality and trying to chase down that last 1/10 packet  
loss that makes peoples video pixelate every 20-30 minutes, and the  
video stream doesn't even originate in your network?


For multicast video to be easier to implement we need more robust  
video codecs that can handle jitter and packet loss that are  
currently present in networks and handled acceptably by TCP for  
unicast.


--
Mikael Abrahamsson    email: [EMAIL PROTECTED]




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Michael . Dillon

   Then why can't they plug in Power, TV & phone line? That's
  where IPTV STBs are going...

OK, I can see that you could use such a set-top box to
sell broadband to households which would not otherwise 
buy Internet services. But that is a niche market.

 Especially as more and more ISPs/telcos hand out WLAN boxen of various
 kinds - after all, once you have some sort of Linux (usually)
 networked appliance in the user's premises, it's quite simple to
 deploy more services (hosted VoIP, IPTV, media centre, connected
 storage, maybe SIP/Asterisk..) on top of that.

He didn't say that his STB had an Ethernet port.
And I'm not aware of any generic Linux box that can
be used to deploy additional services other than
do-it-yourself. And that too is a niche market.

Also, note that the proliferation of boxes, each
needing its own power connection and some place 
to sit, is causing its own problems in the household.
Stacking boxes is not straightforward because some have
air vents on top and others are not flat on top.
The TV people have not learned the lessons that the hi-fi
component people learned back in the 1960s.

--Michael Dillon



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Alexander Harrowell


On 1/10/07, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote:


   Then why can't they plug in Power, TV & phone line? That's
  where IPTV STBs are going...

OK, I can see that you could use such a set-top box to
sell broadband to households which would not otherwise
buy Internet services. But that is a niche market.

 Especially as more and more ISPs/telcos hand out WLAN boxen of various
 kinds - after all, once you have some sort of Linux (usually)
 networked appliance in the user's premises, it's quite simple to
 deploy more services (hosted VoIP, IPTV, media centre, connected
 storage, maybe SIP/Asterisk..) on top of that.

He didn't say that his STB had an Ethernet port.
And I'm not aware of any generic Linux box that can
be used to deploy additional services other than
do-it-yourself. And that too is a niche market.



For example: France Telecom's consumer ISP in France (Wanadoo) is
pushing out lots and lots of WLAN boxes to its subs, which it brands
Liveboxes. As well as the router, they also carry their carrier-VoIP
and IPTV STB functions. If they can be remotely managed, then they are
a potential platform for further services beyond that. See also 3's
jump into Slingboxes.


Also, note that the proliferation of boxes, each
needing its own power connection and some place
to sit, is causing its own problems in the household.
Stacking boxes is not straightforward because some have
air vents on top and others are not flat on top.
The TV people have not learned the lessons that the hi-fi
component people learned back in the 1960s.



Analogous to the question of whether digicams, iPods etc will
eventually be absorbed by mobile devices. Will convergence on IP,
which tends towards concentration of functions on a common box,
outpace the creation of new boxes? CES this year saw a positive rash
of home server products.


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Andre Oppermann


Alexander Harrowell wrote:

Analogous to the question of whether digicams, iPods etc will
eventually be absorbed by mobile devices.


I guess eventually it will go the other way around as well.  I was
very surprised not to see Steve Jobs announce an iPod Nano-Phone.
An iPod Nano with bare-bone GSM functionality as provided by one
of the recent single-chip developments from TI and SiLabs AeroFon.
Would fit nicely and cover 85% of all use cases, that is voice and
SMS.  True mass-market.  Pop in your SIM and you're ready to rock.
A slightly enhanced click-wheel would make a nice input device too
(and no, do not emulate a rotary phone).  All together would cost
only $15 more than the base iPod.  GSM single chip is really cheap.

Yeah, I'm a dreamer.

--
Andre


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Sam Stickland


Will Hargrave wrote:

[EMAIL PROTECTED] wrote:

  

I have to admit that I have no idea how BT charges
ISPs for wholesale ADSL. If there is indeed some kind
of metered charging then Internet video will be a big
problem for the business model. 


They vary, it depends on what pricing model has been selected.

http://tinyurl.com/yjgsum has BT Central pipe pricing. Note those are
prices, not telephone numbers. ;-)

If you convert into per-megabit charges - at least an order of magnitude
greater than the cost of transit, and at least a couple of orders of
magnitude more than peering/partial transit.
  
A cursory look at the document doesn't seem to show any prices above 
622Mbps, but for that you're looking at about £160,000 a year or 
£21/Mbps/month.


2GB per day equates to 190Kbps (assuming a perfectly even distribution 
pattern, which of course would never happen), which would be £3.98 a 
month per user. In reality I imagine that you could see usage peaking at 
about 3 times the average, or considerably greater if large flash crowd 
events occur.
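
(Checking that arithmetic, assuming decimal gigabytes and a perfectly
even 24-hour spread:)

pipe_mbps = 622
pipe_cost_year = 160_000                        # GBP
per_mbps_month = pipe_cost_year / 12 / pipe_mbps
print(f"{per_mbps_month:.2f} GBP/Mbps/month")   # ~21.44

avg_kbps = 2e9 * 8 / 86_400 / 1e3               # 2 GB/day as a constant rate
print(f"{avg_kbps:.0f} kbps")                   # ~185
print(f"{avg_kbps / 1e3 * per_mbps_month:.2f} GBP/user/month")   # ~3.97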


I would say that in the UK market today, those sorts of figures are 
enough to destroy current margins, but certainly not high enough that 
the costs couldn't be passed onto the end user as part of an Internet 
TV package.

p2p is no panacea to get around these charges; in the worst case p2p
traffic will just transit your central pipe twice, which means the
situation is worse with p2p not better.

For a smaller UK ISP, I do not know if there is a credible wholesale LLU
alternative to BT.
  
Both Bulldog (CW) and Easynet sell wholesale LLU via an L2TP handoff. 
It's been a while since I was in that game so any prices I have will be 
out of date by now, but IIRC both had the option to pay them per line 
_or_ for a central pipe style model. The per line prices were just about 
low enough to remain competitive, with the central pipe being cheaper for 
volume (but of course, only because you'd currently need to buy far less 
bandwidth than the total of all the lines in use; most ASDL users 
consume a surprisingly small amount of bandwidth and they aggregate very 
well).

Note this information is of course completely UK-centric. A more
regionalised model (21CN?!) would change the situation.

Will

  

S


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Petri Helenius


Marshall Eubanks wrote:

Actually, this is true with unicast as well.

This can (I think) largely be handled by a fairly moderate amount of 
Forward Error Correction.


Regards
Marshall
Before streaming meant HTTP-like protocols over port 80, and UDP was 
actually used, we did some experiments with FEC and discovered that 
reasonable interleaving (so that two consecutive lost packets could be 
recovered) and 1:10 FEC resulted in a zero-loss environment in all cases 
we tested.
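
(A toy Python illustration of that scheme, assuming one XOR parity
packet per ten data packets and an interleave depth of two, so a burst
of two consecutive losses lands in two different blocks, each of which
can then repair its single missing packet:)

from functools import reduce

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def parity(block):
    return reduce(xor, block)                  # XOR of all ten packets

def interleave(block_a, block_b):
    """Alternate packets from two blocks onto the wire."""
    wire = []
    for pa, pb in zip(block_a, block_b):
        wire += [pa, pb]
    return wire

def recover(block, lost, par):
    """Rebuild one missing packet from the other nine plus the parity."""
    rest = [p for i, p in enumerate(block) if i != lost]
    return reduce(xor, rest, par)

a = [bytes([i] * 4) for i in range(10)]        # ten 4-byte "packets"
b = [bytes([i + 100] * 4) for i in range(10)]
wire = interleave(a, b)

# A burst loss of wire[4] and wire[5] kills a[2] and b[2]: one loss per
# block, so both packets are recoverable.
assert wire[4] == a[2] and wire[5] == b[2]
assert recover(a, 2, parity(a)) == a[2]
assert recover(b, 2, parity(b)) == b[2]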


Pete



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Simon Leinen

Alexander Harrowell writes:
 For example: France Telecom's consumer ISP in France (Wanadoo) is
 pushing out lots and lots of WLAN boxes to its subs, which it brands
 Liveboxes. As well as the router, they also carry their carrier-VoIP
 and IPTV STB functions. [...]

Right, and the French ADSL ecosystem mostly seems to be based on these
boxes - Proxad/free.fr has its Freebox, Alice ADSL (Telecom Italia)
the AliceBox, etc.  All these have SCART (peritelevision) TV plugs
in their current incarnations, in addition to the WLAN access points
and phone jacks that previous versions already had.

Personally I don't like this kind of bundling, and I think being able
to choose telephony and video providers independently of ISP is better.
But the business model seems to work in that market.  Note that I
don't have any insight or numbers, just noticing that non-technical
people (friends and family in France) do seem to be capable of
receiving TV over IP (although not over the Internet) - confirming
what Simon Lockhart claimed.

Of course there are still technical issues such as how to connect two
TV sets in different parts of an apartment to a single *box.  (Some
boxes do support two simultaneous video channels depending on
available bandwidth, which is based on the level of unbundling
(degroupage) in the area.)

As far as I know, the French ISPs use IP multicast for video
distribution, although I'm pretty sure that these IP multicast
networks are not connected to each other or to the rest of the
multicast Internet.
-- 
Simon.


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Richard Naylor


At 08:58 a.m. 10/01/2007 -0500, Gian Constantine wrote:

All H.264?


no - H.264 is only the free stuff. Pretty well it's all WindowsMedia - 
because of the DRM capabilities. The rights holders are insisting on that.


No DRM = no content. (from the big content houses)

The advantage of WM DRM is that smaller players can add DRM to their 
content quite easily and these folks want to be able to control that space. 
Even when they are part of an International conglomerate, each country 
subsidiary seems to get non-DRM'ed material and repackage it (ie add DRM). 
I understand this is how folks like Sony dish out the rights - on a country 
basis, so each subsidiary gets to define the business rights (ie play 
rights) in their own country space. WM DRM has all of this well defined.


Rich






Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Thomas Leavitt
It seems to me that multi-cast is a technical solution for the bandwidth 
consumption problems precipitated by real-time Internet video broadcast, 
but it doesn't seem to me that the bulk of current (or even future) 
Internet video traffic is going to be amenable to distribution via 
multi-cast - or, at least, separate and apart from whatever happens with 
multi-cast, a huge and growing volume of video traffic will be flowing 
over the 'net...


I don't think consumers are going to accept having to wait for a 
scheduled broadcast of whatever piece of video content they want to 
view - at least if the alternative is being able to download and watch 
it nearly immediately. That said, for the most popular content with the 
widest audience, scheduled multi-cast makes sense... especially when the 
alternative is waiting for a large download to finish - contrawise, it 
doesn't seem reasonable to be constantly multi-casting *every* piece of 
video content anyone might ever want to watch (that in itself would 
consume an insane amount of bandwidth). How many pieces of video content 
are there on YouTube? How many more can we expect to emerge over the 
next decade, given the ever decreasing cost of entry for reasonably 
decent video production?


All of which, to me, leaves the fundamental issue of how the upsurge in 
traffic is going to be handled left unresolved.


Thomas

Simon Lockhart wrote:

On Tue Jan 09, 2007 at 07:52:02AM +, [EMAIL PROTECTED] wrote:
  

Given that the broadcast model for streaming content
is so successful, why would you want to use the
Internet for it? What is the benefit?



How many channels can you get on your (terrestrial) broadcast receiver?

If you want more, your choices are satellite or cable. To get cable, you 
need to be in a cable area. To get satellite, you need to stick a dish on 
the side of your house, which you may not want to do, or may not be allowed

to do.

With IPTV, you just need a phoneline (and be close enough to the exchange/CO
to get decent xDSL rate). In the UK, I'm already delivering 40+ channels over
IPTV (over inter-provider multicast, to any UK ISP that wants it).

Simon
  



--
Thomas Leavitt - [EMAIL PROTECTED] - 831-295-3917 (cell)

*** Independent Systems and Network Consultant, Santa Cruz, CA ***




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Marshall Eubanks



On Jan 10, 2007, at 11:19 PM, Thomas Leavitt wrote:

It seems to me that multi-cast is a technical solution for the  
bandwidth consumption problems precipitated by real-time Internet  
video broadcast, but it doesn't seem to me that the bulk of current  
(or even future) Internet video traffic is going to be amenable to  
distribution via multi-cast - or, at least, separate and apart from  
whatever happens with multi-cast, a huge and growing volume of  
video traffic will be flowing over the 'net...


I would fully agree with this.



I don't think consumers are going to accept having to wait for a  
scheduled broadcast of whatever piece of video content they want  
to view - at least if the alternative is being able to download and  
watch it nearly


That's the pull model. The push model will also exist. Both will make  
money.


immediately. That said, for the most popular content with the  
widest audience, scheduled multi-cast makes sense... especially  
when the alternative is waiting for a large download to finish -  
contrariwise, it doesn't seem reasonable to be constantly multi- 
casting *every* piece of video content anyone might ever want to  
watch (that in itself would consume an insane amount of bandwidth).  
How many pieces of video content are there on YouTube? How many  
more can we expect to emerge over the next decade, given the ever  
decreasing cost of entry for reasonably decent video production?


Lots. Remember, of course, Sturgeon's law. But, lots. If you want  
numbers, 10^4 channels, billions of pieces of uncommercial content,  
and millions of pieces of commercial content.
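
(For scale, assuming ~1.5 Mbps SD streams:)

sd_mbps = 1.5
channels = 10**4
print(f"{channels:,} channels -> {channels * sd_mbps / 1e3:,.0f} Gbps")  # 15
titles = 1_000_000                  # just the commercial catalogue
print(f"{titles:,} titles -> {titles * sd_mbps / 1e6:,.1f} Tbps")        # 1.5
# Carrying every channel at once is heavy but finite; constantly
# multicasting every title clearly is not, per Thomas's point above.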




All of which, to me, leaves the fundamental issue of how the  
upsurge in traffic is going to be handled left unresolved.




I think that technically, we have a pretty good idea how. I think  
that the real fundamental question is whose business models will  
allow them to make a profit from this upsurge.



Thomas



Regards
Marshall



Simon Lockhart wrote:
On Tue Jan 09, 2007 at 07:52:02AM +,  
[EMAIL PROTECTED] wrote:



Given that the broadcast model for streaming content
is so successful, why would you want to use the
Internet for it? What is the benefit?



How many channels can you get on your (terrestrial) broadcast  
receiver?


If you want more, your choices are satellite or cable. To get  
cable, you need to be in a cable area. To get satellite, you need  
to stick a dish on the side of your house, which you may not want  
to do, or may not be allowed

to do.

With IPTV, you just need a phoneline (and be close enough to the  
exchange/CO
to get decent xDSL rate). In the UK, I'm already delivering 40+  
channels over

IPTV (over inter-provider multicast, to any UK ISP that wants it).

Simon




--
Thomas Leavitt - [EMAIL PROTECTED] - 831-295-3917 (cell)

*** Independent Systems and Network Consultant, Santa Cruz, CA ***





A side-note Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Scott Weeks



: ...My view on this subject is U.S.-centric...this 
: is NANOG, not AFNOG or EuroNOG or SANOG.

The 'internet' is generally boundary-less.  I would hope that one day our 
discussions will be likewise.  Otherwise, the forces of the boundary-creators 
will segment everything we are working on and defend the borders they've created.

scott


RE: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Brandon Butterworth

 Given that the broadcast model for streaming content
 is so successful, why would you want to use the
 Internet for it?

We now have to pay for spectrum, when you have to pay you look for the
cheapest delivery path.

Until we switch off analogue there is a shortage of spectrum so we have
limited channels and no room for much HD

All spectrum is for sale so someone can come in and buy it out from
underneath existing uses. Hence spectrum freed by analogue switch off
is less likely to be available for TV. As price is determined by the
highest bidder you can be priced out of carrying on doing what was fine
before.

People have content they want to deliver where the business model
requires cheaper delivery than traditional broadcast TV

brandon


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Peter Dambier


Gian Constantine wrote:
Well, yes. My view on this subject is U.S.-centric. In fairness to me, 
this is NANOG, not AFNOG or EuroNOG or SANOG.


I thought Québec and Mexico did belong to the North American Network too.

...



I agree there is a market for ethnic and niche content, but it is not 
the broad market many companies look for. The investment becomes much 
more of a gamble than marketing the latest and greatest (again debatable 
:-) ) to the larger market of...well...everyone.




There is only a minority in North America who happens to be white, and
only some of them speak English.


I remember the times when I could watch Mexican TV transmitted from a
studio in Florida.

Today everything is encrypted on the sats. We have to use the internet
when we want something special here in Germany.

I guess Karin and I are not the only ones who do not even own a TV set.
The internet is the richer choice.

Even if it is mostly audio (video is nasty overseas), I am sure it does
make an impact in North America. Listening to my VoIP phone is mostly
impossible now, at least overseas. I used to be able to phone overseas,
but even the landline has deteriorated because the phone companies have
switched to VoIP themselves.


Cheers
Peter and Karin

--
Peter and Karin Dambier
Cesidian Root - Radice Cesidiana
Rimbacher-Strasse 16
D-69509 Moerlenbach-Bonsweiher
+49(6209)795-816 (Telekom)
+49(6252)750-308 (VoIP: sipgate.de)
mail: [EMAIL PROTECTED]
mail: [EMAIL PROTECTED]
http://iason.site.voila.fr/
https://sourceforge.net/projects/iason/
http://www.cesidianroot.com/



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Michael . Dillon

 I remember the times when I could watch Mexican TV transmitted from a
 studio in Florida.

If it comes from a studio in Florida then it
is AMERICAN TV, not Mexican TV. I believe there
are three national TV networks in the USA, 
which are headquartered in Miami and which 
broadcast in Spanish.

--Michael Dillon



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Gian Constantine
I am not sure what I was thinking. Mr Bonomi was kind enough to point  
out a failed calculation for me. Obviously, an HD file would only be  
about 3.7GB for a 90 minute file at 5500kbps. In my haste, I  
neglected to convert bits to bytes. My apologies.
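
(The corrected arithmetic, for reference:)

def title_gb(kbps, minutes):
    return kbps * 1000 * minutes * 60 / 8 / 1e9   # bits to decimal GB

print(f"SD 1500 kbps, 90 min: {title_gb(1500, 90):.2f} GB")   # ~1.01
print(f"HD 5500 kbps, 90 min: {title_gb(5500, 90):.2f} GB")   # ~3.71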


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]



On Jan 8, 2007, at 9:07 PM, Gian Constantine wrote:

There may have been a disconnect on my part, or at least, a failure  
to disclose my position. I am looking at things from a provider  
standpoint, whether as an ISP or a strict video service provider.


I agree with you. From a consumer standpoint, a trickle or off-peak  
download model is the ideal low-impact solution to content  
delivery. And absolutely, a 500GB drive would almost be overkill on  
space for disposable content encoded in H.264. Excellent SD (480i)  
content can be achieved at ~1200 to 1500kbps, resulting in about a  
1GB file for a 90 minute title. HD is almost out of the question  
for internet download, given good 720p at ~5500kbps, resulting in a  
30GB file for a 90 minute title.


Service providers wishing to provide this service to their  
customers may see some success where they control the access medium  
(copper loop, coax, FTTH). Offering such a service to customers  
outside of this scope would prove very expensive, and likely, would  
never see a return on the investment without extensive peering  
arrangements. Even then, distribution rights would be very  
difficult to attain without very deep pockets and crippling revenue  
sharing. The studios really dislike the idea of transmission  
outside of a closed network. Don't forget. Even the titles you  
mentioned are still owned by very large companies interested in  
squeezing every possible dime from their assets. They would not be  
cheap to acquire.


Further, torrent-like distribution is a long long way away from  
sign off by the content providers. They see torrents as the number  
one tool of content piracy. This is a major reason I see the  
discussion of tripping upstream usage limits through content  
distribution as moot.


I am with you on the vision of massive content libraries at the  
fingertips of all, but I see many roadblocks in the way. And,  
almost none of them are technical in nature.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]



On Jan 8, 2007, at 7:51 PM, Bora Akyol wrote:



Please see my comments inline:


-Original Message-
From: Gian Constantine [mailto:[EMAIL PROTECTED]
Sent: Monday, January 08, 2007 4:27 PM
To: Bora Akyol
Cc: nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a
day, continuously?


snip


I would also argue storage and distribution costs are not
asymptotically zero with scale. Well designed SANs are not
cheap. Well designed distribution systems are not cheap.
While price does decrease when scaled upwards, the cost of
such an operation remains hefty, and increases with additions
to the offered content library and a swelling of demand for
this content. I believe the graph becomes neither asymptotic,
nor anywhere near zero.


To the end user, there is no cost to downloading videos when they are
sleeping.
I would argue that other than sports (and some news) events, there is
pretty much no content that
needs to be real time. What the downloading (possibly 24x7) does  
is to
stress the ISP network to its max since the assumptions of  
statistical

multiplexing
go out the window. Think of a Tivo that downloads content off the
Internet
24x7.

The user is still paying for only what they pay each month, and  
this is

network neutrality 2.0 all over again.



You are correct on the long tail nature of music. But music
is not consumed in a similar manner as TV and movies.
Television and movies involve a little more commitment and
attention. Music is more for the moment and the mood. There
is an immediacy with music consumption. Movies and television
require a slight degree more patience from the consumer. The
freshness (debatable :-) ) of new release movies and TV can
often command the required patience from the consumer. Older
content rarely has the same pull.


I would argue against your distinction between visual and auditory
content.
There is a lot of content out there that a lot of people watch and  
the

content
is 20-40+ years old. Think Brady Bunch, Bonanza, or archived games  
from

NFL,
MLB etc. What about Smurfs (for those of us with kids)?

This is only the beginning.

If I can get a 500GB box and download MP4 content, that's a lot of
essentially free storage.

Coming back to NANOG content, I think video (not streamed but  
multi-path

distributed video) is going to bring the networks down not by sheer
bandwidth alone but by challenging the assumptions behind the
engineering

Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Leo Vegoda


On Jan 9, 2007, at 1:51 AM, Bora Akyol wrote:

[...]


I would argue that other than sports (and some news) events, there is
pretty much no content that needs to be real time.


I'm not sure I agree. I've noticed that almost any form of live TV,  
with the exception of news and sports programming, uses the benefit  
of real time transmission to allow audience interaction. For instance:


- Phone in discussion and quiz shows
- Any show with voting
- Video request shows

Not only does this type of programming require real-time  
distribution; since these shows are quite often cheaper to produce than  
pre-recorded entertainment or documentaries, they tend to fill a large  
portion of the schedule. In some cases the show producers share  
revenue from the phone calls, too. That makes them more attractive to  
commissioning editors, I suspect.


Leo



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Joe Abley



On 8-Jan-2007, at 22:26, Gian Constantine wrote:

My contention is simple. The content providers will not allow P2P  
video as a legal commercial service anytime in the near future.  
Furthermore, most ISPs are going to side with the content providers  
on this one. Therefore, discussing it at this point in time is  
purely academic, or more so, diversionary.


There are some ISPs in North America who tell me that something like  
80% of their traffic *today* is BitTorrent. I don't know how accurate  
their numbers are, or whether those ISPs form a representative  
sample, but it certainly seems possible that the traffic exists  
regardless of the legality of the distribution.


If the traffic is real, and growing, the question is neither academic  
nor diversionary.


However, if we close our eyes and accept for a minute that P2P video  
isn't happening, and all growth in video over the Internet will be in  
real-time streaming, then I think the future looks a lot more scary.  
When TSN.CA streamed the World Junior Hockey Championship final via  
Akamai last Friday, there were several ISPs in Toronto who saw their  
transit traffic *double* during the game.



Joe



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Keith




We have looked at Amazon's S3 solution for storage since it is 
relatively cheap. But the transit costs from Amazon are quite expensive 
when it comes to moving media files at a large scale. At $0.20 per GB of 
data transferred, that would get extremely expensive. At Pando we move 
roughly 60 TB a day just from our super nodes. Amazon is cheap storage 
but expensive delivery on a large scale.


Keith O'Neill
Sr. Network Engineer
*Pando Networks*


Simon Lyall wrote:

On Mon, 8 Jan 2007, Gian Constantine wrote:
  

I would also argue storage and distribution costs are not
asymptotically zero with scale. Well designed SANs are not cheap.
Well designed distribution systems are not cheap. While price does
decrease when scaled upwards, the cost of such an operation remains
hefty, and increases with additions to the offered content library
and a swelling of demand for this content. I believe the graph
becomes neither asymptotic, nor anywhere near zero.



Let's see what I can do using today's technology:

According to the iTunes website they have over 3.5 million songs. Let's
call it 4 million. Assume a decent bit rate and make them average 10 MB
each. That's 40 TB, which would cost me $6k per month to store on Amazon
S3. Let's assume we use Amazon EC2 to only allow torrents of the files to
be downloaded, and that we transfer each file twice per month. Total cost:
around $20k per month, or $250k per year. Add $10k to pay somebody to
create the interface and put up a few banner ads and it'll be
self-supporting.

That sort of setup could come out of petty cash for larger ISPs' marketing
departments.

Of course there are a few problems with the above business model (mostly
legal), but infrastructure costs are not one of them. Plug in your own
numbers for movies and TV shows, but 40 TB for each will probably be enough.
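
Worked through in Python as a sanity check (a sketch; the $0.15/GB-month
storage price is an assumption about 2007-era S3 pricing, the other
figures are the ones in the post above):

# Reproducing the estimate above. Assumption: S3 storage at $0.15/GB-month.
songs = 4_000_000
avg_mb = 10
library_gb = songs * avg_mb / 1_000          # 40,000 GB = 40 TB

storage = library_gb * 0.15                  # ~$6,000/month to store
transfer = 2 * library_gb * 0.20             # ~$16,000/month (each file twice)
monthly = storage + transfer                 # ~$22,000/month
yearly = monthly * 12                        # ~$264,000/year, i.e. roughly $250k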

  




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Gian Constantine
Those numbers are reasonably accurate for some networks at certain  
times. There is often a back and forth between BitTorrent and NNTP  
traffic. Many ISPs regulate BitTorrent traffic for this very reason.  
Massive increases in this type of traffic would not be looked upon  
favorably.


If you considered my previous posts, you would know I agree streaming  
is scary on a large scale, but unicast streaming is what I reference.  
Multicast streaming is the real solution. Ultimately, a global  
multicast network is the only way to deliver these services to a  
large market.
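
To illustrate what the receiver side of that looks like, here is a
minimal IPv4 multicast receiver sketch in Python (the group address and
port are invented for the example; any group in the administratively
scoped 239/8 range would do):

import socket
import struct

GROUP, PORT = "239.1.1.1", 5004   # hypothetical group and port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# IP_ADD_MEMBERSHIP makes the host signal (via IGMP) that it wants the
# group; the network then delivers one copy per link, not one per viewer.
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

while True:
    data, sender = sock.recvfrom(2048)   # e.g. RTP packets of the stream

The host side is the easy part; the hard part is getting that join
honored across provider boundaries, which is exactly the inter-provider
gap discussed below.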


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]



On Jan 9, 2007, at 11:01 AM, Joe Abley wrote:




On 8-Jan-2007, at 22:26, Gian Constantine wrote:

My contention is simple. The content providers will not allow P2P  
video as a legal commercial service anytime in the near future.  
Furthermore, most ISPs are going to side with the content  
providers on this one. Therefore, discussing it at this point in  
time is purely academic, or more so, diversionary.


There are some ISPs in North America who tell me that something  
like 80% of their traffic *today* is BitTorrent. I don't know how  
accurate their numbers are, or whether those ISPs form a  
representative sample, but it certainly seems possible that the  
traffic exists regardless of the legality of the distribution.


If the traffic is real, and growing, the question is neither  
academic nor diversionary.


However, if we close our eyes and accept for a minute that P2P  
video isn't happening, and all growth in video over the Internet  
will be in real-time streaming, then I think the future looks a lot  
more scary. When TSN.CA streamed the World Junior Hockey  
Championship final via Akamai last Friday, there were several ISPs  
in Toronto who saw their transit traffic *double* during the game.



Joe





Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Joe Abley



On 9-Jan-2007, at 11:29, Gian Constantine wrote:

Those numbers are reasonably accurate for some networks at certain  
times. There is often a back and forth between BitTorrent and NNTP  
traffic. Many ISPs regulate BitTorrent traffic for this very  
reason. Massive increases in this type of traffic would not be  
looked upon favorably.


The act of regulating p2p traffic is a bit like playing whack-a-mole.  
At what point does it cost more to play that game than it costs to  
build out to carry the traffic?


If you considered my previous posts, you would know I agree  
streaming is scary on a large scale, but unicast streaming is what  
I reference. Multicast streaming is the real solution. Ultimately,  
a global multicast network is the only way to deliver these  
services to a large market.


The trouble with IP multicast is that it doesn't exist, in a
wide-scale, deployed, inter-provider sense.



Joe



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Sean Donelan


On Tue, 9 Jan 2007, Gian Constantine wrote:
Those numbers are reasonably accurate for some networks at certain times. 
There is often a back and forth between BitTorrent and NNTP traffic. Many 
ISPs regulate BitTorrent traffic for this very reason. Massive increases in 
this type of traffic would not be looked upon favorably.


If you considered my previous posts, you would know I agree streaming is 
scary on a large scale, but unicast streaming is what I reference. Multicast 
streaming is the real solution. Ultimately, a global multicast network is the 
only way to deliver these services to a large market.


Which is why ISPs will see all of the above.  There will be 
store-and-forward video, streaming video, on demand video, real-time 
interactive video, and probably 10 other types I can't think of.


The concern for university or ISP networks isn't that some traffic uses
70% of their network; it's that 5% of the users are using 70%, 80%, 90%,
100% of their network, regardless of what that traffic is. It isn't
background traffic using excess capacity; it peaks at the same times
as other peak traffic.  P2P congestion isn't constrained to a
single transit bottleneck; it causes bottlenecks in every path, local
and transit.  Local congestion is often more of a concern than transit.

The big question is whether the 5% of the users will continue to pay
for 5% of the network, or if they use 70% of the network will they pay
for 70% of the network?  Will 95% of the users see their prices fall 
and 5% of the users see their prices rise?
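
To make the pricing question concrete (an illustrative Python sketch;
every figure in it is invented for the example):

# Illustrative only: what cost-based pricing implies when 5% of users
# generate 70% of the traffic. All figures are made up for the example.
subscribers = 100_000
monthly_cost = 1_000_000.0        # assumed total network cost, USD

heavy = int(subscribers * 0.05)   # 5,000 heavy users
light = subscribers - heavy       # 95,000 light users

flat = monthly_cost / subscribers          # $10.00/month for everyone
heavy_fair = monthly_cost * 0.70 / heavy   # $140.00/month for the 5%
light_fair = monthly_cost * 0.30 / light   # ~$3.16/month for the rest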




RE: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Bora Akyol

 

 -Original Message-
 From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On 
 Behalf Of Gian Constantine
 Sent: Monday, January 08, 2007 7:27 PM
 To: Thomas Leavitt
 Cc: nanog@merit.edu
 Subject: Re: Network end users to pull down 2 gigabytes a 
 day, continuously?
 
 My contention is simple. The content providers will not allow 
 P2P video as a legal commercial service anytime in the near 
 future. Furthermore, most ISPs are going to side with the 
 content providers on this one. Therefore, discussing it at 
 this point in time is purely academic, or more so, diversionary.
 

I don't think they have a choice, really. The state of the art in
application-aware QoS/rate shaping is so far behind the times that by
the time it caught up, the application would have changed.


Bora



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Gian Constantine
You are correct. Today, IP multicast is limited to a few small closed  
networks. If we ever migrate to IPv6, this would instantly change.  
One of my previous assertions was the possibility of streaming video  
as the major motivator of IPv6 migration. Without it, video streaming  
to a large market, outside of multicasting in a closed network, is  
not scalable, and therefore, not feasible. Unicast streaming is a  
short-term bandwidth-hogging solution without a future at high take  
rates.
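
The arithmetic behind that claim, as a sketch (the 2.5 Mbit/s stream
rate and the million-viewer audience are assumptions for illustration):

# Source-side bandwidth for one live channel, unicast vs multicast.
# Assumptions: 2.5 Mbit/s SD stream, one million concurrent viewers.
stream_mbps = 2.5
viewers = 1_000_000

unicast_gbps = stream_mbps * viewers / 1_000   # 2,500 Gbit/s at the head end
multicast_gbps = stream_mbps / 1_000           # 0.0025 Gbit/s: one copy per link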


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]



On Jan 9, 2007, at 11:47 AM, Joe Abley wrote:



On 9-Jan-2007, at 11:29, Gian Constantine wrote:

Those numbers are reasonably accurate for some networks at certain  
times. There is often a back and forth between BitTorrent and NNTP  
traffic. Many ISPs regulate BitTorrent traffic for this very  
reason. Massive increases in this type of traffic would not be  
looked upon favorably.


The act of regulating p2p traffic is a bit like playing whack-a-mole.
At what point does it cost more to play that game than it costs to
build out to carry the traffic?


If you considered my previous posts, you would know I agree  
streaming is scary on a large scale, but unicast streaming is what  
I reference. Multicast streaming is the real solution. Ultimately,  
a global multicast network is the only way to deliver these  
services to a large market.


The trouble with IP multicast is that it doesn't exist, in a
wide-scale, deployed, inter-provider sense.



Joe





Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Joe Abley



On 9-Jan-2007, at 13:04, Gian Constantine wrote:

You are correct. Today, IP multicast is limited to a few small  
closed networks. If we ever migrate to IPv6, this would instantly  
change. One of my previous assertions was the possibility of  
streaming video as the major motivator of IPv6 migration. Without  
it, video streaming to a large market, outside of multicasting in a  
closed network, is not scalable, and therefore, not feasible.  
Unicast streaming is a short-term bandwidth-hogging solution  
without a future at high take rates.


So you are of the opinion that inter-domain multicast doesn't exist  
today for technical reasons, and those technical reasons are fixed in  
IPv6?



Joe



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Marshall Eubanks



On Jan 9, 2007, at 1:04 PM, Gian Constantine wrote:

You are correct. Today, IP multicast is limited to a few small  
closed networks. If we ever migrate to IPv6, this would instantly  
change.


I am curious. Why do you think that ?

Regards
Marshall

One of my previous assertions was the possibility of streaming  
video as the major motivator of IPv6 migration. Without it, video  
streaming to a large market, outside of multicasting in a closed  
network, is not scalable, and therefore, not feasible. Unicast  
streaming is a short-term bandwidth-hogging solution without a  
future at high take rates.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]



On Jan 9, 2007, at 11:47 AM, Joe Abley wrote:



On 9-Jan-2007, at 11:29, Gian Constantine wrote:

Those numbers are reasonably accurate for some networks at  
certain times. There is often a back and forth between BitTorrent  
and NNTP traffic. Many ISPs regulate BitTorrent traffic for this  
very reason. Massive increases in this type of traffic would not  
be looked upon favorably.


The act of regulating p2p traffic is a bit like playing whack-a-mole.
At what point does it cost more to play that game than it costs to
build out to carry the traffic?


If you considered my previous posts, you would know I agree  
streaming is scary on a large scale, but unicast streaming is  
what I reference. Multicast streaming is the real solution.  
Ultimately, a global multicast network is the only way to deliver  
these services to a large market.


The trouble with IP multicast is that it doesn't exist, in a
wide-scale, deployed, inter-provider sense.



Joe







Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Gian Constantine
The available address space for multicast in IPv4 is limited. IPv6
vastly expands this space. And here, I may have been guilty of
putting the cart before the horse. Inter-AS multicast does not exist
today because the motivators are not there. It is absolutely
possible, but providers have to want to do it. Consumers need to see
some benefit from it. Again, the benefit needs to be seen by a large
market. Providers make decisions in the interest of their bottom
line. A niche service is not a motivator for inter-AS multicast. If
demand for variety in service provider selection grows with the
proliferation of IPTV, we may see the required motivation for
inter-AS multicast, which places us in a position to move to the
large multicast space available in IPv6.
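
The address-space difference is easy to quantify (a quick Python check;
the ranges are 224.0.0.0/4 for IPv4, and ff00::/8 with a 112-bit group
ID for IPv6):

# IPv4 multicast is 224.0.0.0/4: a 28-bit group space.
ipv4_groups = 2 ** 28          # 268,435,456 groups, much of it reserved

# IPv6 multicast is ff00::/8: 4 bits of flags, 4 bits of scope,
# and a 112-bit group ID -- per scope.
ipv6_groups = 2 ** 112         # ~5.2e33 groups, per scope

print(f"{ipv4_groups:,} vs 2**112 = {ipv6_groups:.2e}")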


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.


On Jan 9, 2007, at 1:09 PM, Joe Abley wrote:



On 9-Jan-2007, at 13:04, Gian Constantine wrote:

You are correct. Today, IP multicast is limited to a few small  
closed networks. If we ever migrate to IPv6, this would instantly  
change. One of my previous assertions was the possibility of  
streaming video as the major motivator of IPv6 migration. Without  
it, video streaming to a large market, outside of multicasting in  
a closed network, is not scalable, and therefore, not feasible.  
Unicast streaming is a short-term bandwidth-hogging solution  
without a future at high take rates.


So you are of the opinion that inter-domain multicast doesn't exist  
today for technical reasons, and those technical reasons are fixed  
in IPv6?



Joe





Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Gian Constantine
This is a little presumptuous on my part, but what other reason would
motivate a migration to IPv6? I fail to see us running out of unicast
addresses any time soon. I have been hearing that IPv6 is coming for
many years now. I think video service is really the only motivation
for migrating.


I am wrong on plenty of things. This may very well be one of them. :-)

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.

On Jan 9, 2007, at 1:21 PM, Marshall Eubanks wrote:




On Jan 9, 2007, at 1:04 PM, Gian Constantine wrote:

You are correct. Today, IP multicast is limited to a few small  
closed networks. If we ever migrate to IPv6, this would instantly  
change.


I am curious. Why do you think that ?

Regards
Marshall

One of my previous assertions was the possibility of streaming  
video as the major motivator of IPv6 migration. Without it, video  
streaming to a large market, outside of multicasting in a closed  
network, is not scalable, and therefore, not feasible. Unicast  
streaming is a short-term bandwidth-hogging solution without a  
future at high take rates.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]



On Jan 9, 2007, at 11:47 AM, Joe Abley wrote:



On 9-Jan-2007, at 11:29, Gian Constantine wrote:

Those numbers are reasonably accurate for some networks at  
certain times. There is often a back and forth between  
BitTorrent and NNTP traffic. Many ISPs regulate BitTorrent  
traffic for this very reason. Massive increases in this type of  
traffic would not be looked upon favorably.


The act of regulating p2p traffic is a bit like playing whack-a-mole.
At what point does it cost more to play that game than it costs to
build out to carry the traffic?


If you considered my previous posts, you would know I agree  
streaming is scary on a large scale, but unicast streaming is  
what I reference. Multicast streaming is the real solution.  
Ultimately, a global multicast network is the only way to  
deliver these services to a large market.


The trouble with IP multicast is that it doesn't exist, in a
wide-scale, deployed, inter-provider sense.



Joe









Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread John Kristoff

On Tue, 9 Jan 2007 13:21:38 -0500
Marshall Eubanks [EMAIL PROTECTED] wrote:

  You are correct. Today, IP multicast is limited to a few small  
  closed networks. If we ever migrate to IPv6, this would instantly  
  change.
 
 I am curious. Why do you think that ?

I could have said the same thing, but with the opposite meaning.
You take one 10+ year old technology with minimal deployment, put it
on top of another 10+ year old technology also far from being widely
deployed, and you end up with something quickly approaching zero
deployment, instantly.  :-)

John


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Fergie


-- Gian Constantine [EMAIL PROTECTED] wrote:

The available address space for multicast in IPv4 is limited. IPv6 vastly
expands this space. And here, I may have been guilty of putting the cart
before the horse. Inter-AS multicast does not exist today because the
motivators are not there. It is absolutely possible, but providers have to
want to do it. Consumers need to see some benefit from it. Again, the
benefit needs to be seen by a large market. Providers make decisions in
the interest of their bottom line. A niche service is not a motivator for
inter-AS multicast. If demand for variety in service provider selection
grows with the proliferation of IPTV, we may see the required motivation
for inter-AS multicast, which places us in a position to move to the large
multicast space available in IPv6.


I don't think I'd be hanging my hat on IPv6 operational frobs at
this moment in time.

But that's just me. :-)

$.02,

- ferg



--
Fergie, a.k.a. Paul Ferguson
 Engineering Architecture for the Internet
 fergdawg(at)netzero.net
 ferg's tech blog: http://fergdawg.blogspot.com/




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Gian Constantine

Fair enough. :-)

Nearly everything has a time and place, though.

Pretty much everything on this thread is speculative.

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]



On Jan 9, 2007, at 2:13 PM, John Kristoff wrote:



On Tue, 9 Jan 2007 13:21:38 -0500
Marshall Eubanks [EMAIL PROTECTED] wrote:


You are correct. Today, IP multicast is limited to a few small
closed networks. If we ever migrate to IPv6, this would instantly
change.


I am curious. Why do you think that ?


I could have said the same thing, but with the opposite meaning.
You take one 10+ year old technology with minimal deployment, put it
on top of another 10+ year old technology also far from being widely
deployed, and you end up with something quickly approaching zero
deployment, instantly.  :-)

John




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Douglas Otis



On Jan 9, 2007, at 7:17 PM, Fergie wrote:

Gian Constantine [EMAIL PROTECTED] wrote:


If demand for variety in service provider selection grows with the
proliferation of IPTV, we may see the required motivation for
inter-AS multicast, which places us in a position to move to the
large multicast space available in IPv6.


I don't think I'd be hanging my hat on IPv6 operational frobs at  
this moment in time.


This might be sooner than you think.  Microsoft has already begun
introducing PNRP, a peer-to-peer distribution technology for
encapsulating IPv6 within IPv4.  It also works through IPv6 gateways
(not included by Microsoft).  The frobs would be any Vista or XP box
running this protocol, allowing a new type of multicast to exist (a
proprietary one at that).  Perhaps that might explain the non-partisan
computing billboards. : )


Singapore is restricting bandwidth on BitTorrent, so one might wonder
whether the same response is possible with this technology should it
prove problematic.  With many announcing the onset of Web 3.0,
10 Mbit/s connectivity would suggest sustained data rates at this
level should not be a problem.  Photons are cheaper than physical
media.  The question might be whether Ethernet can handle media
delivered on-demand over IP.


-Doug



Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Simon Lockhart

On Tue Jan 09, 2007 at 07:52:02AM +, [EMAIL PROTECTED] wrote:
 Given that the broadcast model for streaming content
 is so successful, why would you want to use the
 Internet for it? What is the benefit?

How many channels can you get on your (terrestrial) broadcast receiver?

If you want more, your choices are satellite or cable. To get cable, you 
need to be in a cable area. To get satellite, you need to stick a dish on 
the side of your house, which you may not want to do, or may not be allowed
to do.

With IPTV, you just need a phone line (and to be close enough to the
exchange/CO to get a decent xDSL rate). In the UK, I'm already delivering
40+ channels over IPTV (over inter-provider multicast, to any UK ISP that
wants it).

Simon


Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Simon Lockhart

On Mon Jan 08, 2007 at 10:26:30PM -0500, Gian Constantine wrote:
 My contention is simple. The content providers will not allow P2P  
 video as a legal commercial service anytime in the near future.  

 Furthermore, most ISPs are going to side with the content providers  
 on this one. Therefore, discussing it at this point in time is purely  
 academic, or more so, diversionary.

In my experience, content providers want to use P2P because it "reduces"
their distribution costs (in quotes, because I'm not convinced it does, in
the real world). Content providers don't care whether access providers like
P2P or not, just whether it works or not.

On one hand, access providers are putting in place rate limiting or blocking
of P2P (subject to discussions of how effective those are), but on the other
hand, content providers are saying that P2P is the future...

Simon


Re: A side-note Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Simon Lockhart

On Tue Jan 09, 2007 at 12:17:56AM -0800, Scott Weeks wrote:
 : ...My view on this subject is U.S.-centric...this 
 : is NANOG, not AFNOG or EuroNOG or SANOG.
 
 The 'internet' is generally boundary-less.  I would hope that one day our
 discussions will be likewise.  Otherwise, the forces of the boundary-creators
 will segment everything we are working on and defend the borders they've
 created.

Unfortunately, content rights owners don't understand this. All they 
understand is that they sell their content in USA, and then the sell it 
again in UK, and then again in France, and again in China, etc. What they
don't want is to sell it once, in the USA, say, and not be able to sell it 
again because it's suddenly available everywhere.

Simon


RE: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Bora Akyol

Simon

An additional point to consider is that it takes a lot of effort (and
money) to get a channel allocated to your content in a cable network.

This is much easier when TV is being distributed over the Internet.
 

 -Original Message-
 From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On 
 Behalf Of Simon Lockhart
 Sent: Tuesday, January 09, 2007 2:42 PM
 To: [EMAIL PROTECTED]
 Cc: nanog@merit.edu
 Subject: Re: Network end users to pull down 2 gigabytes a 
 day, continuously?
 
 
 On Tue Jan 09, 2007 at 07:52:02AM +, 
 [EMAIL PROTECTED] wrote:
  Given that the broadcast model for streaming content
  is so successful, why would you want to use the
  Internet for it? What is the benefit?
 
 How many channels can you get on your (terrestrial) broadcast receiver?
 
 If you want more, your choices are satellite or cable. To get cable, you
 need to be in a cable area. To get satellite, you need to stick a dish on
 the side of your house, which you may not want to do, or may not be allowed
 to do.
 
 With IPTV, you just need a phone line (and to be close enough to the
 exchange/CO to get a decent xDSL rate). In the UK, I'm already delivering
 40+ channels over IPTV (over inter-provider multicast, to any UK ISP that
 wants it).
 
 Simon
 
 


