[NY Times]
July 15, 2003
Teaching Computers to Work in Unison
By STEVE LOHR
Computers do wondrous things, but computer science itself is largely a
discipline of step-by-step progress as a steady stream of innovations in
hardware, software and networking pile up. It is an engineering science
whose frontiers are pushed ahead by people building new tools rendered in
silicon and programming code rather than the breathtaking epiphanies and
grand unifying theories of mathematics or physics.
Yet computer science does have its revelatory moments, typically when
several advances come together to create a new computing experience. One
of those memorable episodes took place in December 1995 at a
supercomputing conference in San Diego. For three days, a prototype
project, called I-Way, linked more than a dozen big computer centers in
the United States to work as if they were a single machine on computationally
daunting simulations, like the collision of neutron stars and the movement
of cloud patterns around the globe.
There were glitches and bugs. Only about half of the 60 scientific
computer simulations over the I-Way worked. But the participants recall
those few days as the first glimpse of what many computer scientists now
regard as the next big evolutionary step in the development of the
Internet, known as grid computing.
"It was the Woodstock of the grid - everyone not sleeping for three days,
running around and engaged in a kind of scientific performance art," said
Dr. Larry Smarr, director of the California Institute for
Telecommunications and Information Technology, who was the program
chairman for the conference.
The idea of lashing computers together to tackle computing chores for
users who tap in as needed - much as they would tap into a utility - has
been around since the 1960's. But to move the concept of distributed computing
utilities, or grids, toward practical reality has taken years of
continuous improvement in computer processing speeds, data storage and
network capacity. Perhaps the biggest challenge, however, has been to
design software able to juggle and link all the computing resources across
far-flung sites, and deliver them on demand.
The creation of this basic software - the DNA of grid computing - has been
led by Dr. Ian Foster, a senior scientist at the Argonne National
Laboratory and a professor of computer science at the University of
Chicago, and Dr. Carl Kesselman, director of the center for grid
technologies at the University of Southern California's Information
Sciences Institute.
They have worked together for more than a decade and, a year after the San
Diego supercomputing conference, they founded the Globus Project to
develop grid software. It is supported mainly by the government, with
financing from the Department of Energy, the National Science Foundation,
NASA and the Defense Advanced Research Projects Agency.
There has been a flurry of grid projects in the last few years in the
United States, Europe and Japan, most of them collaborations among
scientific researchers at national laboratories and universities on
projects like climate modeling, high-energy physics, genetic research,
earthquake simulations and brain research. More recently, computer
companies including IBM, Platform Computing, Sun Microsystems,
Hewlett-Packard and Microsoft have become increasingly interested in grid
technology, and some of the early commercial applications include
financial risk analysis, oil exploration and drug research.
This month, grid computing moved further toward the commercial mainstream
when the Globus Project released new software tools that blend the grid
standards with a programming technology called Web services, developed
mainly in corporate labs, for automated computer-to-computer
communications.
Enthusiasm for grid computing is also broadening among scientists. A
report this year by a National Science Foundation panel, "Revolutionizing
Science and Engineering Through Cyberinfrastructure," called for new
financing of $1 billion a year to make grid-style computing a routine tool
of research.
The long-term grid vision is that anyone with a desktop machine or
hand-held computer can have the power of a supercomputer at his or her
fingertips. And small groups with shared interests could find answers to
computationally complex problems as never before.
Imagine, for example, a handful of concerned citizens running their own
simulation of the environmental impact of a proposed real-estate
development in their community. They wouldn't need their own data center
or consultants. They would describe what they want, and intelligent
software would find the relevant data and summon the computing resources
needed for the simulation.
"The ultimate goal is a fundamental shift in how we go about solving human
problems, and a new way of interacting with technology," Dr. Kesselman
said.
That grand vision, however, is years away, perhaps a decade or more. Dr.
Smarr is the former director of the National Center for