Re: the next frontier of 'privatization'

2003-07-16 Thread ravi
Eubulides wrote:

 Computers do wondrous things, but computer science itself is largely
 a discipline of step-by-step progress as a steady stream of
 innovations in hardware, software and networking pile up. It is an
 engineering science whose frontiers are pushed ahead by people
 building new tools rendered in silicon and programming code rather
 than the breathtaking epiphanies and grand unifying theories of
 mathematics or physics.


funny that the article starts out on the right note (by pointing out
that computer science is not much of a science) and then flip-flops
between acknowledging the accidental, incremental and non-academic
nature of computer technology (at least the software/network part of it)
and the urge to make grand pronouncements about the future.

i am snipping all the amazing stuff about grid computing, to which my
response is this: in the 80s and early 90s, all the hype surrounding
computer science was about artificial intelligence. impressive systems
were demonstrated (expert systems were just the tip of the iceberg, a
mere rough implementation) and various gurus of the future were
proclaimed. the govt of japan, one is told, gambled its technological
future on a fifth-generation project. in the meantime, a mundane
technology, the internet, based on the notion of the stupid network
(http://www.isen.com/stupid.html) and founded on pragmatic, fairly
non-theoretical principles, proved to be the revolution in computer
technology of the 90s.

of course computer science was above such mundane things (which
perhaps explains the sparse contribution of bell labs, with its closed
ivory tower attitude, to the emerging technology, despite sowing the
seeds with unix back in the 60s). the hot topic in computer science
during the 90s was quantum computing. it was going to revolutionize
the science.

then we had the replacement of technology with terminology: B2B, P2P,
enterprise computing, extranet, web services, data mining, data
warehousing, etc.

somehow, in the midst of all of this, actual progress is achieved! ;-)
whether grid computing is fancy technology, fancy terminology, or
something that will actually be a valuable new service on the net will,
i hope, be decided by its adoption, not by its proclamation!


 Computer scientists say the contribution of Dr. Foster and Dr. Kesselman
 to grid computing is roughly similar to that made by Tim Berners-Lee to
 the development of the Web. Mr. Berners-Lee, who is now the director of
 the World Wide Web Consortium at the Massachusetts Institute of
 Technology, came up with the software standards for addressing, linking
 and sharing documents over the Web: U.R.L.'s (uniform resource locators),
 HTTP (hypertext transfer protocol) and HTML (hypertext mark-up language).


and one must remember that berners-lee was no computer scientist and
should really be remembered mostly for being in the right place at the
right time. the good thing about the evolution of the internet is that
unless you are a techie, you see very few names associated with the
technology. the notion of hyperlinked distributed information predates
berners-lee. even in his own time, systems such as gopher and WAIS
provided similar services. and nobody in the general public knows the
names of the authors of those technologies (or would recognize, apart
from marc andreessen, names such as eric bina, the people who actually
wrote the code for mosaic, the browser that was key to the success of
the web), let alone the authors of such building blocks as the berkeley
socket library that made these applications possible.
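
(for the curious, a rough sketch of that low-level plumbing: python's
socket module is a thin wrapper over the berkeley socket calls, and a
bare-bones HTTP fetch over it looks roughly like the code below. the
host name is just a placeholder for illustration.)

  import socket

  # connect/send/recv are the berkeley-style calls that web clients and
  # servers ultimately sit on; "example.org" is only a placeholder host.
  host = "example.org"
  with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
      s.connect((host, 80))
      s.sendall(b"GET / HTTP/1.0\r\nHost: " + host.encode() + b"\r\n\r\n")
      chunks = []
      while True:
          data = s.recv(4096)
          if not data:
              break
          chunks.append(data)
  print(b"".join(chunks)[:200])  # status line and the first few headers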

--ravi


the next frontier of 'privatization'

2003-07-15 Thread Eubulides
[NY Times]
July 15, 2003
Teaching Computers to Work in Unison
By STEVE LOHR


Computers do wondrous things, but computer science itself is largely a
discipline of step-by-step progress as a steady stream of innovations in
hardware, software and networking pile up. It is an engineering science
whose frontiers are pushed ahead by people building new tools rendered in
silicon and programming code rather than the breathtaking epiphanies and
grand unifying theories of mathematics or physics.

Yet computer science does have its revelatory moments, typically when
several advances come together to create a new computing experience. One
of those memorable episodes took place in December 1995 at a
supercomputing conference in San Diego. For three days, a prototype
project, called I-Way, linked more than a dozen big computer centers in
the United States to work as if they were a single machine on computationally
daunting simulations, like the collision of neutron stars and the movement
of cloud patterns around the globe.

There were glitches and bugs. Only about half of the 60 scientific
computer simulations over the I-Way worked. But the participants recall
those few days as the first glimpse of what many computer scientists now
regard as the next big evolutionary step in the development of the
Internet, known as grid computing.

"It was the Woodstock of the grid - everyone not sleeping for three
days, running around and engaged in a kind of scientific performance
art," said Dr. Larry Smarr, director of the California Institute for
Telecommunications and Information Technology, who was the program
chairman for the conference.

The idea of lashing computers together to tackle computing chores for
users who tap in as needed - almost as if it were a utility - has been around
since the 1960's. But to move the concept of distributed computing
utilities, or grids, toward practical reality has taken years of
continuous improvement in computer processing speeds, data storage and
network capacity. Perhaps the biggest challenge, however, has been to
design software able to juggle and link all the computing resources across
far-flung sites, and deliver them on demand.
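
The scheduling idea described above can be sketched in a few lines of
Python. This is only a toy illustration, not the Globus software: local
worker processes stand in for the far-flung computing sites, and the
"job" is an arbitrary placeholder computation.

  from concurrent.futures import ProcessPoolExecutor, as_completed

  # Toy stand-in for a grid scheduler: independent work units are farmed
  # out to whatever resources are available and results are gathered as
  # they complete. A real grid does this across sites and administrative
  # domains; here local processes play that role purely for illustration.
  def simulate(start):
      # Placeholder for one computationally heavy piece of a larger job.
      return sum(i * i for i in range(start, start + 100_000))

  if __name__ == "__main__":
      work_units = [n * 100_000 for n in range(16)]
      with ProcessPoolExecutor() as pool:
          futures = [pool.submit(simulate, w) for w in work_units]
          total = sum(f.result() for f in as_completed(futures))
      print("combined result:", total)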

The creation of this basic software - the DNA of grid computing - has been
led by Dr. Ian Foster, a senior scientist at the Argonne National
Laboratory and a professor of computer science at the University of
Chicago, and Dr. Carl Kesselman, director of the center for grid
technologies at the University of Southern California's Information
Sciences Institute.

They have worked together for more than a decade and, a year after the San
Diego supercomputing conference, they founded the Globus Project to
develop grid software. It is supported mainly by the government, with
financing from the Department of Energy, the National Science Foundation,
NASA and the Defense Advanced Research Projects Agency.

There has been a flurry of grid projects in the last few years in the
United States, Europe and Japan, most of them collaborations among
scientific researchers at national laboratories and universities on
projects like climate modeling, high-energy physics, genetic research,
earthquake simulations and brain research. More recently, computer
companies including IBM, Platform Computing, Sun Microsystems,
Hewlett-Packard and Microsoft have become increasingly interested in grid
technology, and some of the early commercial applications include
financial risk analysis, oil exploration and drug research.

This month, grid computing moved further toward the commercial mainstream
when the Globus Project released new software tools that blend the grid
standards with a programming technology called Web services, developed
mainly in corporate labs, for automated computer-to-computer
communications.

Enthusiasm for grid computing is also broadening among scientists. A
report this year by a National Science Foundation panel, "Revolutionizing
Science and Engineering Through Cyberinfrastructure," called for new
financing of $1 billion a year to make grid-style computing a routine tool
of research.

The long-term grid vision is that anyone with a desktop machine or
hand-held computer can have the power of a supercomputer at his or her
fingertips. And small groups with shared interests could find answers to
computationally complex problems as never before.

Imagine, for example, a handful of concerned citizens running their own
simulation of the environmental impact of a proposed real-estate
development in their community. They wouldn't need their own data center
or consultants. They would describe what they want, and intelligent
software would find the relevant data and summon the computing resources
needed for the simulation.

"The ultimate goal is a fundamental shift in how we go about solving
human problems, and a new way of interacting with technology," Dr.
Kesselman said.

That grand vision, however, is years away, perhaps a decade or more. Dr.
Smarr is the former director of the National Center for