[Vo]:OT: New supercomputer is a rack of PlayStations

2008-02-28 Thread Jones Beene
This is getting seriously off-topic for alternative
energy (or maybe not!)...

But a techno-Geek Vortician sent me info about the
availability NOW of teraflop desktop supercomputers
and servers. If you have the buck$ for a parked
Beamer, say, but would rather have a 24/7 internet
screamer, go for it!

Actually the price of entry into teraflop computing
has dropped in the past 4 years from $25 million
(minimum) to less than $5,000, and will likely
continue to outpace Moore's Law for a while.
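A quick back-of-envelope sketch of that claim (using only
the figures quoted above): a 5000x price drop in four
years implies the price per teraflop halved roughly every
four months, far faster than the 18-24 month doubling
cadence usually quoted for Moore's Law.

```python
import math

# Figures as quoted: ~$25 million four years ago,
# under $5,000 now, for teraflop-class computing.
start_price = 25e6   # dollars
end_price = 5e3      # dollars
years = 4

drop_factor = start_price / end_price       # 5000x cheaper
halvings = math.log2(drop_factor)           # ~12.3 price halvings
halving_months = years * 12 / halvings      # ~3.9 months per halving

print(f"{drop_factor:.0f}x drop; price halves every "
      f"{halving_months:.1f} months")
# Moore's Law is usually quoted at 18-24 months per doubling.
```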

Nevertheless, any of us can still afford to wait a few
years, since the best use would be full speech
recognition with parsing (as opposed to voice-to-text
only) and this is not ready yet.

A few months ago, NVIDIA, a company noted for graphics
processing rather than CPUs, figured out a way to
combine multithreaded parallel graphics chips to do
some incredibly powerful and versatile computing, of
the very same kind which the human brain also does
best.

Their well-named Tesla processors can be linked
together as blade servers. The Tesla S870 has a
retail price of $12,000 and packs four 8-series
GPUs, each doing 500 gigaflops. With this server,
you get 2 teraflops for not all that much, but in
two years, when the novelty has worn off, look for
the same thing for under $2,500.
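The arithmetic behind those two figures, spelled out
(nothing here beyond the numbers quoted above):

```python
# Tesla S870 figures as quoted in the post.
gpus = 4
gflops_per_gpu = 500
retail_price = 12_000  # dollars

total_gflops = gpus * gflops_per_gpu        # 2000 GFLOPS = 2 teraflops
dollars_per_gflop = retail_price / total_gflops  # $6 per gigaflop

print(total_gflops, dollars_per_gflop)  # 2000 6.0
```

Six dollars per gigaflop, versus the millions per
gigaflop of a few years earlier.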

Marvin Minsky at one time claimed that the human
brain is a one-teraflop-equivalent analog computer,
but he caught so much flak from Roger Penrose and
others that he raised his estimate (so as not to be
too embarrassing to humans?). Something similar
happened when the horsepower was designated as a
comparative unit. I copied a Wiki entry on James Watt
and the naming of the HP below, mainly for its
historical value to word-phreaks. 

However, I agree with those in AI who opine that a
doctorate level of human brain-power will likely
require 10 teraflops, once the necessary software is
available.

This fall, a graphics card for any computer, called
the Tesla C870, will crank out 500 gigaflops and
sell for $1,499. But as we all continue to lament
(especially those of us with deficient typing and
spelling skills), the best use for this kind of
computing power, outside of academia, would be to
dispense with the keyboard altogether; yet speech,
or even accurate voice recognition, is not
perfected, and parsing the words into true
actionable meaning is even further away.

Jones


History of the horsepower (paraphrased from Wiki)

... straight from the 'horse's mouth', so to speak.

The term horsepower was coined by James Watt to help
market his improved steam engine. He had previously
agreed to take royalties of one third of the savings
in coal for this engine, but that scheme did not work
with customers who used horses instead. 

Watt determined that a horse could turn a mill wheel
144 times in an hour. The wheel was 12 feet in radius,
therefore the horse travelled 2.4 × 2π × 12 feet
in one minute. Watt judged that the horse could pull
with a force of 180 pounds. This all was rounded to
33,000 ft·lbf/min.
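Watt's figure checks out. Carrying the numbers above
through: 144 turns per hour is 2.4 turns per minute,
the circumference is 2π × 12 feet, and the pull is
180 pounds-force:

```python
import math

# James Watt's horsepower estimate, as described above.
turns_per_min = 144 / 60           # 2.4 turns per minute
radius_ft = 12                     # mill wheel radius
force_lbf = 180                    # horse's pulling force

distance_per_min = turns_per_min * 2 * math.pi * radius_ft  # ~181 ft/min
power = distance_per_min * force_lbf                        # ft·lbf/min

print(round(power))  # 32572, which Watt rounded to 33,000
```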

Engineering in History recounts that Smeaton
estimated that an average horse could produce
22,916 foot-pounds per minute over time. Desaguliers
increased that number, but Watt standardized the
figure at 33,000.

Put into perspective, a healthy human can produce
about 1.2 hp briefly, in a sprint - and sustain about
0.1 hp indefinitely; and trained athletes can manage
up to about 0.3 horsepower for a period of several
hours.

Most observers familiar with horses estimate that Watt
was intentionally optimistic and wanted his engine to
over-deliver as a replacement, and that few horses can
maintain a one-HP effort for long. Regardless,
comparisons of machines to horses proved to be an
enduring marketing tool.

Re: [Vo]:OT: New supercomputer is a rack of PlayStations

2008-02-28 Thread Jed Rothwell

Jones Beene wrote:

Actually the price of entry into teraflop computing has dropped in 
the past 4 years from $25 million (minimum) to less than $5,000, and 
will likely continue to outpace Moore's Law for a while.


I believe that is because recent breakthroughs are mainly in 
massively parallel processing (MPP) software, rather than hardware. 
Hardware is dependent on Moore's law, but software is not. MPP 
software was very difficult to develop and it lagged for many years, 
so you might say progress was held back and it is now catching up.


Google has made the largest contribution to this software. Their 
equipment in the aggregate constitutes the world's largest MPP 
supercomputer. When you look back at the grand early supercomputers 
such as Illiac, it is kind of a letdown to realize that the world's 
most impressive computer today is held together with Velcro, and used 
mainly for advertising.


- Jed