Re: internet infrastructure investment data

2003-10-08 Thread Kendall Clark
On Tue, Oct 07, 2003 at 08:33:52PM -0700, Doyle Saylor wrote:

 Doyle,
 Couple of things, while for you the term moron is simply a label that
 indicates you think Shirky is not interesting, for me as a disability rights
 advocate I find the term anti-disabled.  If you read Stephen Jay Gould's
 book on The Mismeasure of Man you get a decent insight on this made-up
 word.  The basic concept from the early nineteen hundreds in the IQ
 'science' underlying the word moron was a person too stupid to learn how to
 read.  The science behind the concept was dismantled by Gould.  So the term
 moron, while associated in the public mind with developmentally disabled
 persons, is simply empty of meaning because it has no scientific validity.

Ah, yes, thanks Doyle. Your point is well taken, and I apologize for loosely
throwing around that term. I did, in fact, mean precisely what you said: I
simply don't find Shirky interesting or convincing. He's probably a really
smart person, I just don't care for his work.

Thanks for the gentle correction.

 As to your personal insight into Shirky, I always thought Bush was not
 intellectually able, but I don't dwell on labeling him stupid because that
 is an empty way of trying to understand what is going on.  Just a brief
 reaction to your wording about Shirky.

Noted and accepted. (And I really meant my comments to be taken as my
opinion of the value of his *work*, not as any sort of insight into *him*.)

 I would say, though, you can't argue that investment in the telecom industry
 is what made things scale up to 5B+ documents if people didn't use the
 internet as well; it was, after all, for a couple of decades just a backwater
 in the sciences community.

My point was that infrastructure investment is systematically
under-estimated, among the technical crowd I write for regularly, as *part*
of the overall explanation. I didn't mean to imply that such investment was
alone a necessary or sufficient condition. Sorry if that wasn't clear.

 If you are meaning 5B+ (billion plus) documents,
 I am struck by the statistic that there is roughly one document on the web
 for every five hundred documents in private intranet resources.  So I think
 about these things in terms of public and private intellectual property.

Well, sure, and that's an interesting way to think about them, just not one
that I was working toward in this context. While my publisher will let me do
a bit of politech, it's a very short leash, and this really is a practical
programming book.

 ...support for the web.  So you are downgrading the intellectual labor process
 that goes into the web by dwelling on the machinery behind it.  Maybe that
 isn't your intent, but it strikes me that way.

I'm surprised that you read anything I wrote to mean *that*. There are lots
of books which explain to programmers how to write software that runs on
the Web. That seems a perfectly reasonable kind of book to write.

I'm writing one such book. It strikes me as relevant to dwell on the
machinery behind it, since that is what the book is about. I have no
interest in downgrading the intellectual labor process that goes into the
web, nor do I think I've done that. I've been part of that process since
1995, so it would be an odd thing for me to downgrade.

 me,
 This reads to me like you have a thing about the W3C (World Wide Web
 Consortium) being overblown in value, and that the machinery and spending on
 the infrastructure are much more important.

In point of fact, the HTTP protocol is a product of the IETF, not the W3C. I
think that the W3C is a very peculiar institution, and I've written about it
a lot. I'd be perfectly happy to discuss those issues with you. My throwaway
comment about Berners-Lee was simply meant to suggest, as I've done on LBO
before, that I think he's overrated.

 ...'ideas' across.  Even if I think you are off the beam I get a lot out of a
 capable person writing in depth including having a historical sense of time
 and place.

That's a fair and good suggestion. Again, I'm not sure I can squeeze that
into *this* book, but this is the sort of thing I do regularly in my weekly
columns, for what it's worth.

 I hope I gave you some value for your request for advice.  I was trying to
 be helpful.

I appreciate and recognize that.

Thanks,
Kendall Clark


Re: internet infrastructure investment data

2003-10-08 Thread ravi
Kendall Clark wrote:
 On Tue, Oct 07, 2003 at 10:16:51PM -0400, ravi wrote:

 But there is an idea floating around geekdom that the Web works
 (in the sense that it scales 5B+ documents, something which no
 one really expected) because of various purely technological
 ideas...

 i could use some clarification of the statement above. does it mean
  that in geekdom there is an idea that the web works *only*
 because of technological ideas? if not, then the claim is a truism
  isn't it?

 No, it means that many technical people believe the Web *still* works
  at the present scale because of some specific changes that were made
  to the HTTP protocol. That is, these folks give no credence to the
 alternative explanation that, even w/out those specific technical
 changes, the Web would work at the present scale because of massive
 infrastructural investment...


could you point me to some sources? i find it very surprising that
technical people believe that changes to HTTP can be the sole cause of
performance gains (especially given that caching, which indeed does, at
great cost, distribute load, was mostly possible with early HTTP
versions, and further modifications of HTTP were aimed, in a large
sense, at addressing some of the technical deficiencies of a protocol
designed by a non-protocols person, e.g. persistent connections).
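
(a minimal sketch of the connection reuse i mean, in modern python -- the
host and paths are just placeholders: several requests ride on one TCP
connection, where HTTP/1.0 would by default set up and tear down a
connection per request.)

    import http.client

    # one TCP connection, reused across requests (the HTTP/1.1 default);
    # under HTTP/1.0 each request would normally pay for a fresh connection.
    conn = http.client.HTTPConnection("www.example.com")
    for path in ("/", "/index.html"):
        conn.request("GET", path)
        resp = conn.getresponse()
        resp.read()  # drain the body so the connection can be reused
        print(path, resp.status, resp.getheader("Connection"))
    conn.close()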

almost all the technical people i know will readily point to the
increase in network bandwidth (due to the excessive deployment by telcos
in the boom years), the large drop in disk/memory/cpu prices, etc. as
significant (perhaps even larger) causes of the gains in scaling. they
would also not find these gains surprising at all. i do not. i do not
find it surprising at all that the internet has scaled to the modest
level it currently is at. i *would* be surprised if we were doing
real-time video over the internet (at the scale of current radio/tv
broadcasting), but that's another beast.



 1. what is the web? is it the internet + the various web servers
  and documents that they serve?

 That's a good question. I mean in this case it's that part of the
 Internet which happens via HTTP, server & client. It's a significant
  percentage of total Internet usage.


so, you are talking about the web component of internet usage? and by
that i assume you mean network usage i.e., available bandwidth and
throughput on routers and other intermediate devices?


 2. what does scaling to 5b+ documents mean? 5b+ html files stored
 somewhere on networked computers? 5b+ documents transmitted in
 parallel (i.e., capacity)?

 Neither, actually. There are something like 5B+ addressable resources
  (things which have URIs), but they aren't all HTML files on
 servers, many of them are resources which are computed on-the-fly.
 And I doubt anyone believes that all of these resources could be
 simultaneously requested.


i use HTML files as shorthand, but if we are talking about dynamic
content, are you including server load and performance? (the previous
point suggests otherwise).

i wouldn't be surprised actually if there comes a day in the near
future, when 5 billion documents are in transit simultaneously on the
internet. i would guess (admittedly a very rough guess) that that number
is already in the millions right now.

and as more and more people blindly adopt HTTP as their transport
protocol (simply because of such technologies as web services), often
ignoring the years of work done on protocol design, many of these
documents will be transported over HTTP (though it might not be the
right transport at all). to take the rant a bit further about the
ascendancy of buzzword compliance since the corporatization of the
net and the IETF: the current craze with XML was well ridiculed by a
recent RFC (an internet technical specification document) which parodied
this trend by putting forward a proposal for IP over XML!!

http://www.faqs.org/rfcs/rfc3252.html



 what does default libertarian geek mind mean? that by default
 you assume all geeks are libertarians? or that you have found
 them to be so?

 I mean that the dominant ideology among the geek set (well, large
 chunks of it anyway, it's probably not more monolithic than any other
  subculture) is strong right libertarian, especially on the issue of
 where technology comes from. It's *not* a David Noble-friendly part
 of the world, at least as I have experienced it. (And, yes, I do tend
  to assume that most geeks are right libertarians, given the dominant
  ideology, but it's a loose assumption which I stand ready to modify.
  Anyway, not sure how this is relevant...)


i am not sure how this is relevant either, but hey, i didn't mention it
;-). you must have thought it relevant, otherwise why would you mention
it? ;-) and as a geek, of course i take offense! seriously, however, all
the geeks i know are somewhat of a mix of humanitarian or analytical
leftist. of course we might differ on what we consider a geek. perhaps
this is a west vs east coast 

Re: internet infrastructure investment data

2003-10-08 Thread ravi
ravi wrote:
 snip

I mean that the dominant ideology among the geek set (well, large
chunks of it anyway, it's probably not more monolithic than any other
 subculture) is strong right libertarian, especially on the issue of
where technology comes from. It's *not* a David Noble-friendly part
of the world, at least as I have experienced it. (And, yes, I do tend
 to assume that most geeks are right libertarians, given the dominant
 ideology, but it's a loose assumption which I stand ready to modify.
 Anyway, not sure how this is relevant...)

 i am not sure how this is relevant either, but hey, i didnt mention it
 ;-). you must have thought it relevant, otherwise why would you mention


don't know why that got cut off, but here's the rest of my message:

i am not sure how this is relevant either, but hey, i didn't mention it
;-). you must have thought it relevant, otherwise why would you mention
it? ;-) and as a geek, of course i take offense! seriously, however, all
the geeks i know are somewhat of a mix of humanitarian or analytical
leftist. of course we might differ on what we consider a geek. perhaps
this is a west vs east coast thing?

the IETF (or perhaps the IAB or IESG, i forget who authored the
document) for instance suggests that it is neither a dictatorship nor a
democracy, but that it works by technical consensus (if you believe
some) or as a meritocracy (in the words of others). in the words of dave
clark: "we reject kings, presidents, and voting -- we believe in running
code!" would you call that a libertarian viewpoint?



 Sorry, but I wouldn't dream of asking an actual computer
 technical question on PEN-L or LBO. :)

 why not?

 Because it's completely off-topic? Isn't that obvious?



it's obvious that it's off-topic, but it's not obvious (at least to me)
that that's why you wrote the above. michael has been quite lenient
towards computer tech questions on this list and people have asked them,
and some have even got answers!


 I've already explained it, so I won't do so again. I'm not gonna go
 'round and 'round about this, Ravi, since it's not really germane to
 my question. I'm starting to regret including any surrounding
 context.


you have to realize that i ask these questions because:

1. what you specified as the context was not clear to me. it still is
not (and probably because i am not reading you right).

2. i am surprised by your generalizations about the geek and computer
science community. perhaps what you mean by geek is the high-school geek
set while what i mean is the hacker crowd (for the general audience:
'hacker' does not mean what the media has wrongly used the term to
represent i.e., someone who breaks into computers). i have lived among
the hacker and computer science community for 15 years now (including a
long stint at one of the temples: bell labs) and your statements do not
match my experiences very well. if that is because i have misunderstood
my community, then i would appreciate any clarifications that disabuse me.

while these might be peripheral to your main question, once you put
these opinions out in a public venue, i think discussion on them is
valid. of course, if michael thinks we should go off-list, i will gladly
do so.

--ravi


Re: internet infrastructure investment data

2003-10-08 Thread Kendall Clark
On Wed, Oct 08, 2003 at 11:54:43AM -0400, ravi wrote:

 could you point me to some sources? i find it very surprising that
 technical people believe that changes to HTTP can be the sole cause of
 performance gains (especially given that caching, which indeed does, at
 great cost, distribute load, was mostly possible with early HTTP
 versions, and further modifications of HTTP were aimed, in a large
 sense, at addressing some of the technical deficiencies of a protocol
 designed by a non-protocols person, e.g. persistent connections).

I've never said *sole* cause, which may be part of the problem. I think
there are many engineers (probably 'computer scientists' was a bit over the
top earlier) who believe it was a *crucial* cause. I don't know anyone who
claims it's the *sole* cause.

 so, you are talking about the web component of internet usage? and by
 that i assume you mean network usage i.e., available bandwidth and
 throughput on routers and other intermediate devices?

Yes, each time I've specified the kind of infrastructural investment I'm
interested in quantifying, I've specifically mentioned router & bandwidth
advances and capacity investments. As you know, the Web part of the internet
is still TCP/IP traffic.

 and as more and more people blindly adopt HTTP as their transport
 protocol (simply because of such technologies as web services),

Well, part of this argument is about the way SOAP breaks the caching
benefits of HTTP 1.1 (overuse of POST over GET, for example), so, yeah,
that's part of the issue. I would probably quibble with the "blindly adopt"
bit; I think it's more that people are misusing it rather than reflexively
using it when something else entirely should be used.
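
To make the caching point concrete, here's a rough sketch in modern Python
(the host, paths, and payload are invented, so treat it as illustration
only): the same query issued as a plain GET, which intermediaries may cache,
and as a SOAP-style POST, which they will not.

    import http.client

    HOST = "api.example.com"  # hypothetical quote service
    conn = http.client.HTTPConnection(HOST)

    # Cache-friendly: a plain GET; proxies can store and revalidate this.
    conn.request("GET", "/quote/IBM", headers={"Accept": "application/xml"})
    resp = conn.getresponse()
    resp.read()
    print("GET:", resp.status, resp.getheader("Cache-Control"))

    # Cache-hostile: the same query tunneled through a SOAP POST; proxies
    # treat POST as uncacheable, so every call goes back to the origin.
    envelope = ('<?xml version="1.0"?>'
                '<Envelope xmlns="http://schemas.xmlsoap.org/soap/envelope/">'
                '<Body><GetQuote><Symbol>IBM</Symbol></GetQuote></Body>'
                '</Envelope>')
    conn.request("POST", "/soap", body=envelope,
                 headers={"Content-Type": "text/xml"})
    resp = conn.getresponse()
    resp.read()
    print("POST:", resp.status)
    conn.close()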

The utter market failure of BEEP suggests that HTTP has a lot of general
purpose life left in it.

  Because it's completely off-topic? Isn't that obvious?

 it's obvious that it's off-topic, but it's not obvious (at least to me)
 that that's why you wrote the above.

That's why I wrote the above. :)

 2. i am surprised by your generalizations about the geek and computer
 science community. perhaps what you mean by geek is the high-school geek
 set while what i mean is the hacker crowd (for the general audience:
 'hacker' does not mean what the media has wrongly used the term to
 represent i.e., someone who breaks into computers).

No, I mean hackers. Obviously it's not a monolithic set of attitudes &
beliefs. There are obviously pockets of leftie hackers and geeks. But I
still stand by my claim that the dominant ideology is right libertarian. I'm
thinking of the Slashdot crowd, Eric Raymond and his hangers-on, and the
like. Obvious counterpoints include Richard Stallman, the IMC hacker crowd,
many anarchist groups who actively use Web tech, and so on.

 match my experiences very well. if that is because i have misunderstood
 my community, then i would appreciate any clarifications that disabuse me.

I'm not trying to convince you that you've misread your experience. Hell,
I'm jealous that the parts of the hacker world you've interacted with have
not been right libertarian. The parts I have run into frequently have been,
and I still tend to think that it forms the dominant ideology (which is
different from saying that everyone who is a hacker is a right libertarian.
That's clearly wrong).

 while these might be peripheral to your main question, once you put
 these opinions out in a public venue, i think discussion on them is
 valid.

Of course the discussion is valid, if Michael doesn't object. That doesn't
mean I'm interested in pursuing it. :)

Kendall


Re: internet infrastructure investment data

2003-10-08 Thread joanna bujes


No, I mean hackers. Obviously it's not a monolithic set of attitudes &
beliefs. There are obviously pockets of leftie hackers and geeks. But I
still stand by my claim that the dominant ideology is right libertarian. I'm
thinking of the Slashdot crowd, Eric Raymond and his hangers-on, and the
like. Obvious counterpoints include Richard Stallman, the IMC hacker crowd,
many anarchist groups who actively use Web tech, and so on.

I have been working in computing (Tandem/Apple/Sun) for 20 years, and I
would say that though there are a lot of libertarians, they seem to me
to be pretty evenly split between the right and the left. There are also a
fair number of socialists. Then I would say that the current and
continuing outsourcing of techies to India and China is likely to
polarize this group even further.
(I thought HTTP was big because it could get you through firewalls, but
ravi, please correct me if I'm wrong. Oh, and that IP over XML was
hilarious.)
Joanna


Re: internet infrastructure investment data

2003-10-08 Thread ravi
joanna bujes wrote:

 (I thought HTTP was big because it could get you through firewalls,
 but ravi, please correct me if I'm wrong.


no, you are quite right -- HTTP is/was used as a fallback transport for
various applications (such as audio/video streaming), even though it was
not well-suited for them, because, as you suggest, firewall
administrators permitted HTTP into the intranet.

i was referring to the additional effect of these extremely abstracted
web-based network solutions. many of these are quite heavy-duty network
applications but, imho, in their object-oriented/over-abstracted design,
they carry the black-box model of the protocol stack too far. protocols
can and should be fine-tuned to particular applications (i admit i am
being a little vague here).

i use transport protocol in a loose sense above. HTTP is not really a
transport protocol -- it's an application protocol. perhaps i should not
make this loose reference, since this is exactly what i am complaining
against: the use of HTTP as a transport protocol for all applications,
i.e., HTTP as the default and only application layer protocol -- whether
it is ready to perform such an important role is questionable (and afaik
has not been determined; one could even argue that the opposite, i.e.,
that HTTP is a poorly designed application protocol, has been shown to be
a valid conclusion).
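
(to make the complaint concrete, a rough python sketch -- the gateway host
and path are made up -- of the pattern i mean: an application's own payload
wrapped in HTTP POSTs purely because port 80 gets through the firewall.)

    import base64
    import http.client

    def send_over_http(host, payload):
        # wrap an arbitrary application payload in an HTTP POST: the app pays
        # for HTTP's headers, framing and request/response round trips just
        # to get past the firewall, instead of using a transport suited to it
        conn = http.client.HTTPConnection(host, 80)
        conn.request("POST", "/tunnel", body=base64.b64encode(payload),
                     headers={"Content-Type": "application/octet-stream"})
        resp = conn.getresponse()
        data = resp.read()
        conn.close()
        return base64.b64decode(data) if data else b""

    # send_over_http("gateway.example.com", b"one frame of a media stream")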


 Oh, and that IP over XML was hilarious.)


glad you liked it!

--ravi


Re: internet infrastructure investment data

2003-10-08 Thread Doyle Saylor
Greetings Pen-'Ellers,
Well, KGC's response was just fine.  No need to pursue anything in my view;
however, I found some nuggets or tidbits of Telecom stuff here and there in
my notes, so I'll pass it along assuming that it might be of some interest to
KGC.

Tidbits about Telecoms from here and there, which include some dollar
figures as well as other comparisons:

Gridlock on the superhighway
Dec 12th 2002
From The Economist print edition
...In America, the telecoms bust of 2000 has wiped out some 500,000 jobs
and $2 trillion in (apparent) stockmarket value.

...But the main source of the problem, we argued, was that most of the
newcomers (called competitive local exchange carriers, or CLECs, in
America) had simply failed to do their homework.

In particular, the DSL (digital subscriber line) technology that most of
them adopted was singularly inappropriate for the task. Apart from causing
interference problems, the 2B1Q algorithm used in America (and the 4B3T
line code used in Europe) to transmit digital signals along a pair of copper
telephone lines stumbles badly over bridge taps where the wires get
spliced.

...Some readers believed that the CLECs' choice of technology was not
entirely arbitrary. Part of the reason, suggested one insider, was that
most of the CLECs were dependent on 'vendor financing' from the makers of
the older line codes-and, as such, were locked into purchases of inferior
equipment.

from Pen-L, December 6, Nomi writes in response to a Paul Krugman article,

...Krugman
Bad metaphors make bad policy. Everyone talks about the information
highway. But in economic terms the telecommunications network resembles not
a highway but the railroad industry of the robber-baron era - that is,
before it faced effective competition from trucking. And railroads
eventually faced tough regulation, for good reason: they had a lot of market
power, and often abused it.


Telecoms are worse than railroads. The railroads built twice as much
capacity as was needed, while the robber barons cashed out, over a period of
25 years. In telecoms, 20 times as much capacity was built as was needed,
and the cash-out period was 3 years. Railroads were substantially financed
by business speculators in Europe. Telecos, by the US public.

washingtonpost.com

Telecom Sector May Find Past Is Its Future
Giant Phone Companies Offer Stable, Well-Funded Option

By Peter S. Goodman
Washington Post Staff Writer
Monday, July 8, 2002; Page A01

...Investors poured large sums of money into telecommunications -- $880
billion from 1997 to date, according to Thomson Financial in New York. But
there were not enough phone calls or e-mails to sustain the hundreds of new
phone and Internet networks. As that reality emerged in the spring of 2000,
the great unraveling began.

No one knows how much of the investment -- $326 billion in stock and bonds,
plus $554 billion in bank loans -- has been destroyed, but it is surely a
huge sum. "Half is as good a number as any," said Richard J. Peterson, chief
market strategist at Thomson Financial.

At least 63 telecommunications companies have landed in bankruptcy since
2000, according to Bankruptcydata.com.

...This enormous construction project cycled huge amounts of money through
the economy. Local and long-distance telephone companies spent $319 billion
building their networks from 1997 to 2001, said RHK Inc., a San Francisco
research firm. Mobile telephone companies spent more than $58 billion. The
money landed in the coffers of chip-making, software, computer and network
equipment companies.

...From October 1998 to February of this year, the transmission capacity
across the Atlantic expanded by a factor of 19. Meanwhile, the price of a
leased transmission line dropped to $10,000 a year from $125,000, said Eli
Noam, a professor of finance at Columbia University Business School.

FEBRUARY 7, 2002
NEWS ANALYSIS: TECHNOLOGY
By Alex Salkever
Business Week

...What happened? The numbers in the subsea cable business paint a stark
picture. From 1997 to 2001, trans-Atlantic cable capacity increased more
than 20-fold, according to TeleGeography, a telecom consultancy.
Trans-Pacific capacity soared 40-fold. As so many lines were laid, demand
for the services became diluted. Prices for wholesale bandwidth on land and
sea plunged apace, falling between 50% and 70% a year.

DIVING AND DIGGING.  Before Global Crossing launched in 1998, the standard
lifetime contract for 155 megabytes of capacity went for $20 million. Global
Crossing dropped that immediately to $8 million. By the end of 2001, that
same deal drew only $350,000. Long-term contracts no longer hold their
allure for customers, who now seek out more flexible one-year or two-year
leases.

FEBRUARY 4, 2003
Business Week
SPECIAL REPORT: ALL-DISTANCE TELECOM
Alex Salkever
Eating Asia's Broadband Dust
Unlike the halting and financially crippling rollout of high-speed access in
the U.S., in the Far East it has gone much faster and cheaper


Re: internet infrastructure investment data

2003-10-07 Thread Bill Lear
On Tuesday, October 7, 2003 at 13:35:03 (-0400) Kendall Grant Clark writes:
Folks,

I'm working on a technical book about a new way in which corporations are
using the Web to achieve the holy grail, enterprise application
integration, using a new family of technologies called Web
Services. The book is targeted at working programmers, so it will mostly
be that sort of technical content.

I would suggest that you look at Web Services in the broader context
of how corporations attempt to escape the market.  Why do they want
standard services in the first place and which players benefit?

Web Services seems to be just another mechanism for decoupling that
allows independent change of implementation, and (supposedly) some
sort of dynamic lookup of implementation.

You might look at Creating the Computer: Government, Industry,
and High Technology by Kenneth Flamm, and also his Targeting the
Computer: Government Support and International Competition.  However,
these precede the Internet revolution by a few years.

Perhaps also Business Organization and the Myth of the Market Economy
by William Lazonick.


Bill


Re: internet infrastructure investment data

2003-10-07 Thread joanna bujes


Web Services seems to be just another mechanism for decoupling that
allows independent change of implementation, and (supposedly) some
sort of dynamic lookup of implementation.
You might look at Creating the Computer: Government, Industry,
and High Technology by Kenneth Flamm, and also his Targeting the
Computer: Government Support and International Competition.  However,
these precede the Internet revolution by a few years.

Well, as it turns out, this is what I've been documenting and studying
for the last six years -- because I have to write programming books that
teach engineers how to use the various standard APIs that define these
web services. Broadly, the point of having de-coupled, componentized
services is to make it easier to program distributed applications. The
demand for componentized applications that could be deployed on any
platform and operating system was more customer-driven than
engineering-driven. Engineers didn't mind writing huge, monolithic
applications that did not have to bridge heterogeneous environments. But,
of course, if you wanted to redeploy such applications into a different
environment, you'd have to rewrite them. Expensive. So the notion of
transparent communications across the net and of write once, run
anywhere applications became very important.
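
To give a feel for what that decoupling looks like from the programmer's
side, here is a tiny sketch using modern Python's standard XML-RPC client
(the endpoint and method name are invented): the caller knows only a URL and
a method name, not the platform, language, or implementation behind them.

    import xmlrpc.client

    # the "component" is just a URL; the server could be Java on Solaris,
    # C++ on NT, or anything else that speaks the same wire protocol
    inventory = xmlrpc.client.ServerProxy("http://erp.example.com/rpc")

    # looks like a local call, but executes on the remote system:
    # count = inventory.get_stock_level("widget-42")
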
Computing, in general, cries out for standards and openness; capitalism
depends upon private property, of which intellectual property is a
part. So the development of computing is always pulled in these
completely contradictory directions.
I'm not clear about how much technical background you have, and so I
don't know what needs to be explained.
Try me at home, at 510 451-3109, if you run into troublesome stuff.

Joanna





Re: internet infrastructure investment data

2003-10-07 Thread Doyle Saylor
Greetings Pen-'Ellers,
KGC writes,
But there is an idea floating around geekdom that the Web works (in the
sense that it scales 5B+ documents, something which no one really
expected) because of various purely technological ideas (most of which get
attributed, inaccurately, to Tim Berners-Lee). I want to engage this idea in
my book (for my own nefarious, leftie political reasons) and my publisher is
cool with me doing a bit of politics of technology.

Me,
Clay Shirky writes about the economics of what makes the web work.  Has some
theories about various ideas floating around about the IT industry that are
a starting place to think about what works and doesn't work about Web
Services, etc.

http://www.shirky.com/

Hal Varian writes a column for the NY Times and teaches the economics of
information technology at UC Berkeley.  He may have some specifics for you
to track down about hardware spending versus software ideas like those Tim
Berners-Lee might represent in the public mind.

http://www.sims.berkeley.edu/~hal/

Varian's university web site.

http://www.sims.berkeley.edu/resources/infoecon/

A site of his that gives research sites for information about the
information economy.

Doug Henwood has a new book out sometime in the next decade (has been
promised for more than a year so far) about the new economy in which he
gives an economic accounting of the basic area you are interested in.

Look up Ian Foster, who is chief scientist on the GRID, which is a new
internet-like technology for supercomputing on a large scale.  The cost of
hardware for this project, where it is academia-related, is probably public
information.  Therefore you can get an idea about the relative cost of
building a new internet.  And this system probably is a requirement for the
success of Web Services in the long run, so it gives some insight now into
what Web Services has to bring along to work right.

This question is like asking what Open Source software brings in value to
business.  So you might look at servers and the costs of selling them.  IBM
servers (along with those of other companies) are driving Sun Microsystems
into the ground by utilizing Open Source software.  This gives some idea of
what theory (or human labor) provides over hardware.  Especially look at how
the relative updating costs for Sun are higher than a brand-new installation
of IBM servers.  Not an easy comparison, but it perhaps gives some insight.

Look at labor costs overseas, like India, for IT, because that makes theory
much cheaper to use.  Because what you mean by theory, I think, is
labor costs.

You might clarify your thinking about that issue of theory versus hardware
in technology terms also.  For example, historically for a lefty what is the
path toward programming?  Writing.  What about memory in computer?  The
public libraries.

Writing -
Let's take color in magazines (being print media closely tied to traditional
typescript), which gradually increased from the 1920's onward.  Color
represents a major increase in costs and production for photographs.
Throughout the 20th century color photographs were basically just one big
frill on the ass of the printing trade.  So when we talk about computing and
web services we might ask where the sheer productive volume of writing
theory actually is merited by Web Services.  Hal Varian gives some bench
marks about the sheer volume of information being produced, tv, x-rays,
written text etc.

So the value of theory can be understood in some ways by the general
increase in the volume of produced writing.  One can take radio
transmission, tv transmission etc. as fancy sorts of writing because they
transmit words also.  Some people argue that the value of that sort of stuff
declines to near nothing in the present computing environment.  Copying
costs being just about nil as Clay Shirky would argue.  However, the value
of theory in terms of writing would be the vast increase in unit volume of
writing.  And because of that, a transformation of the sheer structure of
writing, in some analogy like black-and-white photos going over to color.  We
don't exactly foresee what makes a big increase in production of information
important, because our culture never had this option.  Printing in some ways
was a big increase, but the coming increase in memory volume, terabyte hard
disks, allows us to think in terms of tens of thousands of movies stored to
use in theory making.  So instead of the few kilobytes on this list,
theory would entail a gigabyte structured into meaningful writing or
whatever people will end up calling what this points at.

So in that sense I am opposed to Doug Henwood's (amongst others) view of the
economics of theory and software in Information Technology.  I think theory
can actually be looked at in terms of volume of product attached to all
volume of production in economic terms.
Doyle


Re: internet infrastructure investment data

2003-10-07 Thread Kendall Clark
On Tue, Oct 07, 2003 at 04:25:03PM -0700, joanna bujes wrote:

 Computing, in general, cries out for standards and openness; capitalism
 depends upon private property, of which intellectual property is a
 part. So the development of computing is always pulled in these
 completely contradictory directions.

Yes, there are these tensions which pull in opposite directions; one of the
things I do as a weekly tech columnist is try to get it through the default
libertarian geek mind haze that capitalism, in fact, sucks. :)

 I'm not clear about how much technical background you have, and so I
 don't know what needs to be explained.

Well, thanks, but I wasn't asking for a technical explanation at all. I'm a
faculty researcher at UMD in this area, so I'm pretty comfortable with the
tech. I was specifically asking for pointers to economic estimates about the
amount in dollars of infrastructural investment in, say, the period from
the beginnings of the Web, say 1994 to 1995, up to its full-on privatization
and the end of the dot-com craze, so roughly 2001.

Sorry, but I wouldn't dream of asking an actual computer technical question
on PEN-L or LBO. :)

Kendall


Re: internet infrastructure investment data

2003-10-07 Thread Kendall Clark
On Tue, Oct 07, 2003 at 05:58:09PM -0700, Doyle Saylor wrote:

 Me,
 Clay Shirky writes about the economics of what makes the web work.  Has some
 theories about various ideas floating around about the IT industry that are
 a starting place to think about what works and doesn't work about Web
 Services, etc.

 http://www.shirky.com/

Oops, but, but, Clay Shirky is a bit of a moron. (I worked on one of the
books he pub'd as an editor, and I've edited a couple of his tech articles,
so I think I can say this with some confidence). At any rate, I was looking
for actual infrastructure investment numbers, not other people's theories
about how or why the web works, or whatever.

 http://www.sims.berkeley.edu/~hal/

Ah, this may be a good resource. Thanks.

 You might clarify your thinking about that issue of theory versus hardware
 in technology terms also.  For example, historically for a lefty what is the
 path toward programming?  Writing.  What about memory in computer?  The
 public libraries.

Huh? You really lost me here. My question was way simpler than that. The
fact that the Web scales to 5B+ documents is a surprise to computer
scientists. One of the prevailing explanations is that HTTP got smarter
(basically, it became more cachable by intermediaries and proxies) and that
those technical changes (the changes aren't *theoretical* or theory, and I
didn't imply that) were the critical change which has let the Web scale to
5B+ documents.
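
One concrete piece of that cacheability machinery is conditional
revalidation: an intermediary holding a copy can ask the origin whether it
changed and usually gets back a tiny 304 instead of the whole document. A
minimal sketch in modern Python (it assumes the server sends an ETag, which
not all do):

    import http.client

    conn = http.client.HTTPConnection("www.example.com")
    conn.request("GET", "/")
    first = conn.getresponse()
    etag = first.getheader("ETag")
    first.read()

    # a cache revalidating its copy: if nothing changed, the origin answers
    # 304 Not Modified and sends no body at all
    conn.request("GET", "/", headers={"If-None-Match": etag} if etag else {})
    second = conn.getresponse()
    second.read()
    print(second.status)
    conn.close()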

I think it's probably more likely that the Web scales because we spent a
couple hundred billion dollars on telecoms infrastructure, new routing &
packet switching technologies and capacities, etc. So I was hoping to find
some research that quantifies the amount of salient investment during the
relevant period.

Sorry if I wasn't clear.

Thanks,
Kendall Clark


Re: internet infrastructure investment data

2003-10-07 Thread ravi
Doyle Saylor wrote:
 Greetings Pen-'Ellers,
 KGC writes,
 But there is an idea floating around geekdom that the Web works (in the
 sense that it scales 5B+ documents, something which no one really
 expected) because of various purely technological ideas...


i could use some clarification of the statement above. does it mean that
in geekdom there is an idea that the web works *only* because of
technological ideas? if not, then the claim is a truism isn't it?

some additional questions:

1. what is the web? is it the internet + the various web servers and
   documents that they serve?

2. what does scaling to 5b+ documents mean? 5b+ html files stored
   somewhere on networked computers? 5b+ documents transmitted in
   parallel (i.e., capacity)?

--ravi


Re: internet infrastructure investment data

2003-10-07 Thread ravi
Kendall Clark wrote:

 Yes, there are these tensions which pull in opposite directions; one of the
 things I do as a weekly tech columnist is try to get it through the default
  libertarian geek mind haze that capitalism, in fact, sucks. :)


what does default libertarian geek mind mean? that by default you
assume all geeks are libertarians? or that you have found them to be so?
or that something about geek culture (if there is such a single thing)
naturally coincides with libertarian philosophy?


 Sorry, but I wouldn't dream of asking an actual computer technical question
  on PEN-L or LBO. :)


why not?

--ravi


Re: internet infrastructure investment data

2003-10-07 Thread ravi
Kendall Clark wrote:
 Huh? You really lost me here. My question was way simpler than that. The
 fact that the Web scales to 5B+ documents is a surprise to computer
 scientists. One of the prevailing explanations is that HTTP got smarter
 (basically, it became more cachable by intermediaries and proxies) and that
 those technical changes (the changes aren't *theoretical* or theory, and I
 didn't imply that) were the critical change which has let the Web scale to
 5B+ documents.


i posted this already, but i will repeat my question: could you explain
further what you mean by the web scales to 5b+ documents? and who are
the computer scientists who are surprised by this?

there are some interesting questions with respect to scaling and the web
-- for instance, the issue of scaling individual servers w.r.t.
simultaneous access. see, e.g., the c10k problem/project:
http://www.kegel.com/c10k.html -- is this the sort of thing you are
talking about? but your messages are a bit confusing...
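
(the flavor of the c10k answer, for the curious, is event-driven i/o -- one
thread multiplexing many sockets instead of a thread or process per
connection. a toy sketch in modern python, nothing like production code:)

    import selectors
    import socket

    sel = selectors.DefaultSelector()
    srv = socket.socket()
    srv.bind(("", 8080))
    srv.listen()
    srv.setblocking(False)
    sel.register(srv, selectors.EVENT_READ)

    while True:
        for key, _events in sel.select():
            if key.fileobj is srv:
                # new connection: register it and keep going
                conn, _addr = srv.accept()
                conn.setblocking(False)
                sel.register(conn, selectors.EVENT_READ)
            else:
                # readable client socket: answer and close
                sock = key.fileobj
                if sock.recv(4096):
                    sock.sendall(b"HTTP/1.0 200 OK\r\n\r\nhello\n")
                sel.unregister(sock)
                sock.close()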

--ravi


Re: internet infrastructure investment data

2003-10-07 Thread Kendall Clark
On Tue, Oct 07, 2003 at 10:16:51PM -0400, ravi wrote:

  But there is an idea floating around geekdom that the Web works (in the
  sense that it scales 5B+ documents, something which no one really
  expected) because of various purely technological ideas...
 

 i could use some clarification of the statement above. does it mean that
 in geekdom there is an idea that the web works *only* because of
 technological ideas? if not, then the claim is a truism isn't it?

No, it means that many technical people believe the Web *still* works at the
present scale because of some specific changes that were made to the HTTP
protocol. That is, these folks give no credence to the alternative
explanation that, even w/out those specific technical changes, the Web would
work at the present scale because of massive infrastructural investment --
which undoubtedly happened, I'd just like some kind of reasonably accurate
estimate of its value in dollars.

 some additional questions:

 1. what is the web? is it the internet + the various web servers and
documents that they serve?

That's a good question. I mean in this case it's that part of the Internet
which happens via HTTP, server & client. It's a significant percentage of
total Internet usage.

 2. what does scaling to 5b+ documents mean? 5b+ html files stored
somewhere on networked computers? 5b+ documents transmitted in
parallel (i.e., capacity)?

Neither, actually. There are something like 5B+ addressable resources
(things which have URIs), but they aren't all HTML files on servers, many
of them are resources which are computed on-the-fly. And I doubt anyone
believes that all of these resources could be simultaneously requested.

Anyway, irrespective of the overall context of my question, what I'm really
trying to figure out is the magnitude of infrastructural investment.

Kendall
--
Nobody said it was easy
No one ever said it would be this hard
Oh take me back to the start
--Coldplay, The Scientist

Jazz is only what you are. -- Louis Armstrong

Do you realize
that happiness makes you cry?
   -- The Flaming Lips


Re: internet infrastructure investment data

2003-10-07 Thread Kendall Clark
On Tue, Oct 07, 2003 at 10:20:11PM -0400, ravi wrote:
 Kendall Clark wrote:
 
  Yes, there are these tensions which pull in opposite directions; one of the
  things I do as a weekly tech columnist is try to get it through the default
   libertarian geek mind haze that capitalism, in fact, sucks. :)
 

 what does default libertarian geek mind mean? that by default you
 assume all geeks are libertarians? or that you have found them to be so?

I mean that the dominant ideology among the geek set (well, large chunks of
it anyway, it's probably not more monolithic than any other subculture) is
strong right libertarian, especially on the issue of where technology comes
from. It's *not* a David Noble-friendly part of the world, at least as I
have experienced it. (And, yes, I do tend to assume that most geeks are
right libertarians, given the dominant ideology, but it's a loose assumption
which I stand ready to modify. Anyway, not sure how this is relevant...)

  Sorry, but I wouldn't dream of asking an actual computer technical question
   on PEN-L or LBO. :)

 why not?

Because it's completely off-topic? Isn't that obvious?

Kendall


Re: internet infrastructure investment data

2003-10-07 Thread Kendall Clark
On Tue, Oct 07, 2003 at 10:25:21PM -0400, Kendall Clark wrote:

 Kendall
 --
 Nobody said it was easy
 No one ever said it would be this hard
 Oh take me back to the start
 --Coldplay, The Scientist
...

Oops, my apologies for not trimming my .signature.

Kendall


Re: internet infrastructure investment data

2003-10-07 Thread Kendall Clark
On Tue, Oct 07, 2003 at 10:25:57PM -0400, ravi wrote:

 i posted this already, but i will repeat my question: could you explain
 further what you mean by the web scales to 5b+ documents? and who are
 the computer scientists who are surprised by this?

I've already explained it, so I won't do so again. I'm not gonna go 'round
and 'round about this, Ravi, since it's not really germane to my question.
I'm starting to regret including any surrounding context.

 there are some interesting questions with respect to scaling and the web
 -- for instance, the issue of scaling individual servers w.r.t
 simultaneous access.

Yes, of course. But that's not what I meant; I suggest we move this
off-list if you want to talk about it further. Those specific issues aren't
relevant to my real question.

Kendall Clark


Re: internet infrastructure investment data

2003-10-07 Thread Doyle Saylor
Greetings Pen-L,
KGC writes,
Oops, but, but, Clay Shirky is a bit of a moron.

Doyle,
Couple of things, while for you the term moron is simply a label that
indicates you think Shirky is not interesting, for me as a disability rights
advocate I find the term anti-disabled.  If you read Stephen Jay Gould's
book on The Mismeasure of Man you get a decent insight on this made-up
word.  The basic concept from the early nineteen hundreds in the IQ
'science' underlying the word moron was a person too stupid to learn how to
read.  The science behind the concept was dismantled by Gould.  So the term
moron, while associated in the public mind with developmentally disabled
persons, is simply empty of meaning because it has no scientific validity.

As to your personal insight into Shirky, I always thought Bush was not
intellectually able, but I don't dwell on labeling him stupid because that
is an empty way of trying to understand what is going on.  Just a brief
reaction to your wording about Shirky.

Shirky represents an influential part of the IT industry, accessible and
available for you to read.  How some of that school analyzes its industry
bears upon your request.  Since you know him well, that offering from me is
irrelevant to your question.  In fact, too bad he was not a pleasure to work
with and brilliant.  Life is too short to waste upon someone whom one does
not respect.

I would say, though, you can't argue that investment in the telecom industry
is what made things scale up to 5B+ documents if people didn't use the
internet as well; it was, after all, for a couple of decades just a backwater
in the sciences community.  If you are meaning 5B+ (billion plus) documents,
I am struck by the statistic that there is roughly one document on the web
for every five hundred documents in private intranet resources.  So I think
about these things in terms of public and private intellectual property.

As far as that goes, it is interesting that investment had three roughly
parallel tracks in the telecom world: the U.S., Europe, and Asia.  If
scaling up is a key issue, how does each region differ?

you write,
One of the prevailing explanations is that HTTP got smarter
(basically, it became more cachable by intermediaries and proxies) and that
those technical changes (the changes aren't *theoretical* or theory, and I
didn't imply that) were the critical change which has let the Web scale to
5B+ documents.

me,
well I read this comment in your previous email, which says:

you wrote,
because of various purely technological ideas (most of which get
attributed, inaccurately, to Tim Berners-Lee)

me,
Which sounds to me like you are going to write about technology ideas (and
implementation) and you just don't agree that was important for the web in
relation to the infrastructure built during the great bubble economy.  But
you seem not so much intent on validating theory or ideas as important for
the 5B+ documents that were made possible by the investment in infrastructure of
support for the web.  So you are downgrading the intellectual labor process
that goes into the web by dwelling on the machinery behind it.  Maybe that
isn't your intent, but it strikes me that way.

you write in the previous email,
My suggestion will be that of at least *equal importance* to these
technical fixes (having mostly to do with the differences between version
1.0 and 1.1 of the HTTP protocol, for anyone who cares) is the massive
influx of investment dollars to beef up the infrastructure of the
Internet, most of which the Web benefited from.

me,
This reads to me like you have a thing about the W3C (World Wide Web
Consortium) being overblown in value, and that the machinery and spending on
the infrastructure are much more important.  Roughly like saying the auto
industry spends a gazillion dollars a year on plants and infrastructure and
the labor of the auto workers is sort of secondary.  May not be what you
mean, but I get a little bit of that sort of message here also as well as
above.

you write,
Huh? You really lost me here.

me,
One analogy that Doug Henwood uses to good effect about the relative lack of
change between the nineteenth century and the twentieth century is that he
points out that during the nineteenth century, with the telegraph wires,
communication leapt from an era of foot travel to instant communications
around the world, which was unprecedented in world history, while one could
look at computing communications as a much less spectacular addition to
human culture.  To me, if you are going to talk about the so-called
3rd-generation telecom industry and what its meaning was, as a lefty you must pay
'ideas' across.  Even if I think you are off the beam I get a lot out of a
capable person writing in depth including having a historical sense of time
and place.

I hope I gave you some value for your request for advice.  I was trying to
be helpful.
thanks,
Doyle