On Sun, 08 Jun 2014 23:32:33 -0700, Rustom Mody wrote:
> On Monday, June 9, 2014 9:50:38 AM UTC+5:30, Steven D'Aprano wrote:
>> On Sun, 08 Jun 2014 19:24:52 -0700, Rustom Mody wrote:
>> > On Monday, June 9, 2014 7:14:24 AM UTC+5:30, Steven D'Aprano wrote:
>> >> CPU technology is the triumph of brute force over finesse.
>> > If you are arguing that computers should not use millions/billions of
>> > transistors, I won't argue, since I don't know the technology.
>> No. I'm arguing that they shouldn't convert 90% of their energy input
>> into heat.
> Strange statement.
> What should they convert it into then?
Useful work, duh.
Everything *eventually* gets converted to heat, but not immediately.
There's a big difference between a car that gets 100 miles to the gallon,
and one that gets 1 mile to the gallon. Likewise CPUs should get more
"processing units" (however you measure them) per watt of electricity
See, for example:
    Theoretically, room‑temperature computer memory operating
    at the Landauer limit could be changed at a rate of one
    billion bits per second with only 2.85 trillionths of a
    watt of power being expended in the memory media. Modern
    computers use millions of times as much energy.
Much to my surprise, Wikipedia says that efficiency gains have actually
been *faster* than Moore's Law. That makes sense, though: if a CPU uses
ten times more power to perform one hundred times more computations, it
has become much more efficient, but it still needs a much bigger heat
sink.
> JFTR: Information processing and (physics) energy are about as
> convertible as say: "Is a kilogram smaller/greater than a mile?"
(1) I'm not comparing incompatible units. And (2) there is a fundamental
link between energy and entropy, and entropy is (loosely speaking) the
inverse of information. See Landauer's Principle, linked above. So
information processing and energy are as intimately linked as (say)
current and voltage, or mass and energy, or momentum and position.