I remember it quite well...
http://tech.slashdot.org/story/13/01/03/232259/supercomputer-repossessed-by-state-may-be-sold-in-pieces
With conflicted (but not mixed) feelings. I (all but) bid on providing
the visualization gateways for the system as my first major project
after I left LANL in '08... I'm glad I saw the writing on the
dysfunctional wall of state procurement, as well as the ill-conceived
nature of the whole project. I never cared much for our former
governor and didn't think he understood much outside of politics, this
project being one of the more obvious follies (to me).
mistake this for me agreeing with Martinez's knee-jerk attempts to
dismantle everything he did. I think the Supercomputer initiative was
well-intentioned and, maybe at one level of understanding, very
promising.
But frankly, the main thing "Big Iron" is good for is creating an
expensive sandbox to spend money in. You have to have a lot of money
to spend to even begin to use such machines effectively... mostly
programmers and scientists-cum-programmers to make proper use of them
(and machine rooms with lots of power and environmental control
and...) and staff to keep them running and up to date and... The
state (whether as a government,
a collection of academic institutions, a budding commercial venue for
high tech, thousands of entrepreneurs, etc.) simply didn't (and doesn't)
have the kind of oomph it needs. Sandia and LANL and NCGR all use
UberComputing about as effectively as anyone, but have huge staff and
budgets to make that happen.
I had worked in "big iron" shops most of my career, never really
believing in them. While I *do* think some important things were done
because there was big iron at places like LANL, I think the bulk of the
"important things" happened first on the smaller machines (Vax with VMS
or BSD) in the 80's and then the plethora of Scientific Workstations
(e.g. Sun, SGI, Apollo, HP, NeXT, etc.) in the 90's and ultimately the
PCs running Linux and the mini-clusters that grew from them.
Even though I worked with and on the big iron of different generations
(CDC/Cray/TMC/*NIX-cluster-of-the-month) and even built a few utility
Linux clusters, I never believed that increases of merely one or two
orders of magnitude led to many qualitative advances in computing or
science. There certainly have been *some* important advances made; the
most obvious (in my uneducated opinion) might have been in
bioinformatics. Generally, the value seems to have been in
embarrassingly parallel problems where there was funding to pay for the
"big iron" and a clear value to shortening the time to an answer by a
couple of orders of magnitude (like getting an answer in a day that
otherwise might take a week or even a few months).
I think some Science was accelerated quite well by that kind of
leverage. But in other fields it simply became an excuse for bloated
budgets, distracting scientists from their science and making
(letting?) them become computer scientists. There is plenty of
precedent for this as early as the 40's and 50's, which I respect...
modern computing might not exist were it not for those early "ACs"
(MANIAC, ILLIAC, etc.).
It may seem contradictory, but I *do* believe all that flailing that I
observed (and too often participated in) with big iron and hordes of
small iron (clusters) and DIY/NIH development (from OSs to hardware to
text editors, for criminy's sake!) was an important early seed for much of
our current consumer, entertainment and hobbyist-driven computing.
While *games* may have really fueled the graphics cards, it was SGI that
really got it all moving in the right direction in the first place...
getting over the early hurdles.
I'm not an expert on the Space Program, but while Tang, Space Blankets
and zero-G ballpoints might be the more obvious but mundane (trite?)
spinoffs, there are also more impactful spinoff technologies like
Velcro and photovoltaics and heat pipes (and bears, oh my!).
Similarly, the huge (ginormous?) budgets that Defense and Energy put
into uber-computing over decades have had valuable side-effects... but
I never believed that a *State* could achieve the same thing. Maybe
the Japanese or Chinese "state", but not NM...
Nevertheless, I *am* sympathetic to those who really, really (really)
wanted it to work. But I am not sympathetic to the Martinez gang, who
have been using every opportunity to bash the previous administration.
I think this particular failure is real, but I think the fanfare
around *demolishing* it is totally politics-driven hype of the worst
kind.
Yes, the gear is vintage if not antique, and there is unlikely to be
any *commercial* market for it. I'm not sure of all the implications
of "selling" it in pieces to the (state-run) Universities, but it
seems likely the funding to "buy" it comes out of the same pocket it
goes back into when sold. This might be a useful bookkeeping fiction,
but I suspect it is another
Richardson-bashing/Martinez-grandstanding opportunity.
I don't really agree with Owen's presumption that such resources
can't be used effectively without a fat pipe all the way into our
houses (or offices)... Remote X, VNC, etc. make it pretty easy to do
99% of what you need to do without ever bringing the bulk of the data
back over the net. I too romanticize having a direct Tbit/sec drop at
my dinner table, but I don't think the lack of one explains my lack
of utilization of big iron (whether in Rio Rancho, Los Alamos or
Mountain View).
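To make that concrete, here is a minimal sketch of the pattern Remote
X and VNC embody: ship the question to the data rather than the data
to the question. The hostname and paths are hypothetical placeholders,
and the reduction is deliberately trivial.

#!/usr/bin/env python3
"""Sketch: keep the computation next to the data; only the answer
crosses the network. Hostname/paths below are made up."""
import subprocess

REMOTE = "bigiron.example.edu"   # hypothetical cluster login node

def remote_summary(dataset_path):
    """Run a one-line awk reduction on the remote host; a few bytes
    of summary come back, the multi-GB dataset never moves."""
    cmd = "awk '{s+=$1; n++} END {print n, s/n}' " + dataset_path
    result = subprocess.run(["ssh", REMOTE, cmd],
                            capture_output=True, text=True, check=True)
    return result.stdout.strip()

if __name__ == "__main__":
    # e.g. a simulation output that lives (and stays) on the cluster
    print(remote_summary("/scratch/sim/run42/output.dat"))

The same shape scales from awk one-liners up to full visualization
pipelines: render remotely and move pixels (or better, summaries),
not raw data.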
The availability of the Amazon Cloud and/or the relatively affordable
price of a densely packed GPU/CPU mini-cluster challenges us all to put our
projects where our mouth is and actually implement some effective
parallel algorithms that can do the heavy lifting. The tools are there
to make this 100 times easier than it ever was when I was
learning/developing the tricks of the trade... and it is still hard.
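As a trivial illustration of how low that barrier has gotten (a
sketch, nothing more, using only the standard library), an
embarrassingly parallel Monte Carlo estimate spreads across every
core of one of those boxes in a dozen lines:

#!/usr/bin/env python3
"""Sketch: embarrassingly parallel Monte Carlo estimate of pi."""
import random
from multiprocessing import Pool

def hits_in_quarter_circle(n):
    """Count random points in the unit square that land inside the
    quarter circle; each chunk is fully independent of the others."""
    rng = random.Random()        # separate generator per process
    count = 0
    for _ in range(n):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            count += 1
    return count

if __name__ == "__main__":
    chunks, per_chunk = 8, 1_000_000
    with Pool() as pool:         # one worker per core by default
        hits = sum(pool.map(hits_in_quarter_circle, [per_chunk] * chunks))
    print("pi ~", 4.0 * hits / (chunks * per_chunk))

That is the easy case, of course... which is exactly the point of
the next paragraph.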
My only words of wisdom on the topic might be that instead of
limiting ourselves to well-known "embarrassingly parallel" algorithms
or swimming upstream trying to force-fit intrinsically serial
algorithms into parallel environments, we should look to discovering
(recognizing, inventing?) uniquely different approaches. This is what
the nonlinear and complexity science movement of the 80's did in its
own way to reconfigure formerly intractable (intellectually as well
as computationally) problems into tractable, and sometimes even
*elegant*, problems with similarly elegant solutions.
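A small, classic illustration of that kind of reformulation (my own
example, nothing to do with the NM machine): a running sum looks
intrinsically serial, since each element depends on the one before
it, but recast as independent block sums plus a cheap combining pass
it parallelizes after all:

#!/usr/bin/env python3
"""Sketch: a 'serial' running sum recast as a two-pass parallel
algorithm (independent block scans, then per-block offsets)."""
from itertools import accumulate
from multiprocessing import Pool

def block_scan(block):
    """Running sum within one block -- blocks are independent, so
    this pass parallelizes perfectly."""
    return list(accumulate(block))

if __name__ == "__main__":
    data = list(range(1, 17))
    n_blocks = 4
    size = len(data) // n_blocks
    blocks = [data[i * size:(i + 1) * size] for i in range(n_blocks)]

    with Pool() as pool:
        scans = pool.map(block_scan, blocks)   # pass 1: parallel

    # pass 2: serial but tiny -- one offset per block, not per element
    offset, result = 0, []
    for s in scans:
        result.extend(v + offset for v in s)
        offset += s[-1]

    assert result == list(accumulate(data))    # matches the serial scan
    print(result)

The serial dependency didn't disappear; it shrank to one number per
block, which is the sort of restructuring I mean.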
- Steve
============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com