From Intel’s website: "The term 'thermal management' refers to two major
elements: a heatsink properly mounted to the processor, and effective
airflow through the system chassis. The ultimate goal of thermal management
is to keep the processor at or below its maximum operating temperature."

Robert Rosen defines programmable machines as images or models of
nonprogrammable matter. “Logical depth,” developed by Charles Bennett at IBM,
is a standard measure of information in network administration, digital
security, and other fields. It measures the complexity of outputs - of
whatever sort: numbers, images, etc. - qua underlying algorithmic processes.
Bennett proposes depth as a "formal measure of value." In the background is a
notion adapted from Turing’s theory of computation, which takes the
complexity of algorithmic output and the noncompressibility of algorithms as
models of complexity in general.
Complexity is value. "The value of a message thus appears to reside not in
its information (its absolutely unpredictable parts), nor in its obvious
redundancy (verbatim repetitions, unequal digit frequencies), but rather in
what might be called its buried redundancy - parts predictable only with
difficulty, things the receiver could in principle have figured out without
being told, but only at considerable cost in money, time, or computation."
The outer limit of logical depth, however, is physical complexity. Despite
the physical Church-Turing thesis, which posits that we can model all
entities algorithmically, physical hardware proves to be incomputable.
Instead, it is the non-programmability of this physical dimension that
enables the spaces and images of computability. This physicality is an
absolute depth and the basis against which value is measured, copyrighted,
and marketed.
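
One way to sketch Bennett’s definition (the notation here is assumed, not
drawn from the passage above): the depth of a string x at significance level
s is the least running time of any program for x that is within s bits of
the shortest such program,

    \mathrm{depth}_s(x) \;=\; \min\{\, T(p) : U(p) = x,\ |p| \le K(x) + s \,\}

where U is a universal machine, T(p) the running time of program p, and K(x)
the length of the shortest program that outputs x. Shallow strings are either
trivially redundant or purely random; deep strings are those whose redundancy
is "buried," recoverable only through long computation.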

By contrast, reversible computing or thermodynamic computing asks whether the
energy dissipation in running a computer ultimately balances out with the
computation produced. The simplest operation of AND or OR, even at the
assembler level, involves destruction of information and expenditure of
energy: two input bits are collapsed into one output bit, so the inputs can
no longer be recovered. Copying data would seem to increase information and
deleting data to reduce it, but this is not the case. The computation
involved in deletion still necessarily expends energy. So, tallying up the
balance becomes complex. Following Landauer’s Principle, if computation were
shown not to expend energy, the result would be a time-reversible process, a
violation of the second law of thermodynamics, and the triumph of the digital
over the analog. Computation would be a net gain, the super-production of
immaterial, timeless, and virtual information. It turns out not to be the
case, despite the near reversibility of today’s processors and despite
evidence for reversibility at the quantum level. Before there is digital
simulation, there must be flow. The mere need to power the computer, to flow
electricity into the material shell that simulates the digital space,
establishes the energy spent and the non-reversibility of computation.
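
To put a number on that floor (a minimal sketch, assuming room temperature of
about 300 K; the variable names are mine), Landauer’s Principle prices the
erasure of a single bit at no less than kT ln 2, a few zeptojoules:

    # Minimal sketch of the Landauer bound: the least heat that must be
    # dissipated to erase one bit at temperature T. T = 300 K is an assumed
    # room temperature; the constants are standard physical values.
    import math

    k_B = 1.380649e-23               # Boltzmann constant, J/K
    T = 300.0                        # assumed temperature, K

    bound_joules = k_B * T * math.log(2)
    bound_ev = bound_joules / 1.602176634e-19

    print(f"Erasing one bit at {T:.0f} K dissipates at least "
          f"{bound_joules:.2e} J ({bound_ev:.4f} eV).")

Real gates dissipate orders of magnitude more than this bound, but the bound
itself is enough to mark computation as non-reversible.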

My laptop is scarred and stained and chipped. Its solidity is a trap for
time. I touch the screen. The cool surface is a non-reversible gradient of
energy dissipation.
 
Boron, arsenic, phosphorus, antimony: these dopants are impurities added to a
semiconductor lattice to alter its electrical properties. The resulting shift
in the Fermi level makes the semiconductor into a storage medium. Trace
radionuclides generated from the ceramic packaging used as the substrate in
computer chips can change the contents of a computer memory unpredictably.

For non-Top Secret information, NISPOM prescribes overwriting data in three
passes: with a character, then its complement, and finally with a random
character; e.g., overwrite first with 0000 0000, followed by 1111 1111, then
1001 0111. Gutmann suggests that overwriting data in 35 passes (or any number
of passes) will not necessarily make it harder for laboratories, using a
scanning electron microscope say, to recover data from magnetic storage
media. He also suggests that the methods the DoD actually use differ from
those they recommend to others, so as to make it easier for them to retrieve
data from the hard drive of a member of the public, should they wish to do
so. The DoD also recommends degaussing the storage medium with a permanent
magnet to saturate it with a unidirectional field. A certain magnetic
remanence persists. There is no "zeroisation."
 
My laptop is the peak of a wave.
