On 06/08/2014 10:20 PM, Steven D'Aprano wrote:
> A typical desktop computer uses less than 500 watts for *everything* 
> except the screen. Hard drives. DVD burner. Keyboard, mouse, USB devices, 
> network card, sound card, graphics card, etc. (Actually, 350W is more 
> typical.)
> 
> Moore's Law observes that processing power has doubled about every two 
> years. Over the last decade, processing power has increased by a factor 
> of 32. If *efficiency* had increased at the same rate, that 500W power 
> supply in your PC would now be a 15W power supply. Your mobile phone 
> would last a month between recharges, not a day. Your laptop could use a 
> battery half the size and still last two weeks on a full charge.

Actually, that's not what Moore's law is about.  Moore's law observes
that the number of transistors on a die doubles roughly every two
years.  Any doubling of anything else is entirely coincidental.
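
For what it's worth, here is a quick back-of-the-envelope check of the
figures quoted above, assuming a two-year doubling period; the factor
of 32 and the ~15 W figure both fall straight out of it:

    # Rough sanity check, assuming a two-year doubling period
    # sustained over one decade.
    doubling_period = 2                      # years
    years = 10
    growth = 2 ** (years / doubling_period)
    print(growth)        # 32.0, the quoted "factor of 32"
    print(500 / growth)  # 15.625, i.e. the hypothetical ~15 W supply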

> <snip>
>
> No. I'm arguing that they shouldn't convert 90% of their energy input 
> into heat.

All electronic circuits that don't produce a motive force that performs
work convert 100% of their electrical energy into heat.  I'm using
"work" in the physics sense here.  A CPU takes in electricity and
dissipates 100% of it as heat, and does so immediately.  That
conversion to heat happens to do something useful along the way
(flipping the states of transistors that represent information).  We
used to tell people that computers make very efficient space heaters,
because in fact they do.
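
To put a (hypothetical) number on it, take a package drawing a steady
100 W:

    # Hypothetical CPU package drawing a steady 100 W.
    power_watts = 100
    hours = 8
    heat_kwh = power_watts * hours / 1000.0
    print(heat_kwh)   # 0.8 kWh of heat over a working day,
                      # the same as a 100 W resistive heater left on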