We are on the verge of a big leap in affordable processing power from
inexpensive computers. Essentially, what was a $10 million Cray a decade
ago is now available to the teenage "gamer" ... just as the $10 million IBM
360 evolved into the PC, but this time it is qualitatively different.
Virtual reality is on the horizon, as well as machine "learning" and
human-like visual recognition.

http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GeForce-GTX-TITAN-Preview-GK110-GPU-Boost-20-Overclocking-and-GPGPU

And closer to home for some vorticians, this massive level of computing
power will be capable of controlling a (formerly) low-level task - such as,
say, a small reactor - if needed. The obvious question is: why and how would
you need it? DSPs and PICs and Arduinos are pretty cheap already, and they
can control dozens of interlocking parameters without breaking a sweat.


Yes, the knee-jerk reaction is that you do not need a supercomputer to do the
work of a DSP... but don't forget that in the 1960s, many experts at IBM
could not envision the need for the PC. An expert of today does not need to
be a contrarian to suspect that maybe... just maybe... we will "invent the
need". A detailed answer to that proposition (the emergent need for the
cheap supercomputer) is likewise certainly not obvious. But the point is -
like so many things in modern technology - the best application for a new
device often emerges blindly (to the surprise of all), almost as an
afterthought, following the introduction of the enabling product.

This is the reverse of the tradition where 'necessity' is the mother of
invention. The major paradigm shift we are seeing nowadays in applied
science is that invention is no longer "need-driven" so much as
opportunity-driven. As they say in the flicks: "if you build it, they will
come".
Anyway, back to the "moonshine" of a supercomputer controlling a gainful
energy process... one immediate but general suggestion for how it would
fit in involves the so-called "Maxwell's demon" - a smart device that can
select molecules from the Boltzmann tail of an energy distribution and move
them non-randomly, thereby deriving net energy from ambient conditions. An
Arduino could probably control a few dozen I/O channels - but what if one
seeks to control a few million? Yes, that shifts the "invention" part of
the equation to providing the secondary sensor arrays, which are
non-existent today. Still, "visual recognition" in the human context
requires massive computer power, and this could be the initial use,
especially if the computer is self-learning.
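
As a toy illustration of the selection task (my own sketch, nothing from
the hardware above - the molecule count, distribution, and cutoff are all
arbitrary): draw a few million molecular speeds from a Maxwell-Boltzmann
distribution and have the "demon" gate only the fast tail. The point is
just the scale of per-channel decisions, dozens versus millions.

import numpy as np

# Toy Maxwell's demon: each molecule is one "I/O channel"; the demon
# gates only the fast (Boltzmann-tail) molecules into a second chamber.
rng = np.random.default_rng(0)
N = 1_000_000                           # millions of channels, not dozens
speeds = np.sqrt(rng.chisquare(3, N))   # MB speeds ~ chi dist., 3 d.o.f.
threshold = np.percentile(speeds, 99)   # admit only the top 1%

gated = speeds[speeds > threshold]      # molecules the demon lets through
ratio = (gated**2).mean() / (speeds**2).mean()
print(f"gated {gated.size:,} of {N:,}; tail KE / mean KE = {ratio:.1f}x")

Each pass over the array is one decision per channel; doing that at
kHz-to-MHz rates across millions of channels is GPU-scale work, not
Arduino-scale work.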

There is another application, specific to LENR (and the reason for this
post), but it is based on the hypothesis that the gain, particularly in NiH
reactions, comes from the high end of the mass distribution for protons. I
have not convinced many observers that this hypothesis is accurate (that
hydrogen mass is not quantized, except as an ideal value, like the Bohr
atom)... so it will be a hard sell to convince a VC or angel funder of the
need to develop a supercomputer subsystem for optimizing gain based on this
hypothesis... but that may happen, who knows?
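
Purely to show the shape of that selection problem under the (contested)
premise above - a minimal sketch in which proton mass is treated as a
spread rather than a single value. The Gaussian form and its width are
invented placeholders, not measured quantities.

import numpy as np

# Hypothetical only: assume proton mass has some spread and that the
# "gainful" events come from the high-mass tail. SIGMA is an invented
# illustrative width, not data.
rng = np.random.default_rng(1)
M_P = 938.272                  # MeV/c^2, nominal proton mass
SIGMA = 0.05                   # MeV/c^2, placeholder spread (assumption)

masses = rng.normal(M_P, SIGMA, 10_000_000)
cutoff = M_P + 3 * SIGMA       # select the "high end" at 3 sigma
tail = masses[masses > cutoff]
print(f"tail fraction: {tail.size / masses.size:.2e}")  # ~1.3e-3 at 3 sigma

Even in this cartoon version, the rare-tail selection is a needle-in-a-
haystack search over millions of samples per pass - which is the kind of
job that argues for the cheap supercomputer, not the microcontroller.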

We are reaching the tipping point in the appreciation of the societal harm
caused by fossil fuels - economic harm even more so than climate change. In
fact, the "appreciation of the threat" may be the "necessity" which is the
new mother of a two-tiered invention process, one which is also co-driven
by the new enabling technology, but at a level which is beyond
serendipitous... equal parts 'perspiration' and 'inspiration', but with the
information processor itself defining the major limitation.

Jones 



