On Fri, Sep 20, 2013 at 4:22 PM, Chris de Morsella <cdemorse...@yahoo.com> wrote:
>> A computation always takes a nonzero amount of energy to perform;
>> theoretically you can make the energy used as close to zero as you like,
>> but the less energy you use, the slower the calculation.

> How does that square with the well-measured increase in energy efficiency
> per fundamental unit of logic (single machine operation)? It takes far
> less energy to perform an elementary logic operation on a modern CPU than
> it did on, say, a CPU from ten years ago.
I'm talking about the theoretical limit dictated by the laws of physics;
right now we are nowhere near it, and technological factors are
astronomically more important. According to Landauer's principle, the
minimum energy needed to erase one bit of information is kT*ln(2) joules,
where k is Boltzmann's constant and T is the temperature in kelvins of the
object doing the computation. A joule is a small amount of energy (one
watt-hour is 3600 joules), and Boltzmann's constant is a very, very small
number, about 1.38*10^-23 joules per kelvin, so it will be some time before
we have to start thinking seriously about ways to get around this
theoretical limit with something like reversible computing.
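To put numbers on this, here is a small sketch (the function name and the
choice of 300 K as "room temperature" are mine, not anything from the
thread) that evaluates the Landauer bound kT*ln(2):

```python
import math

BOLTZMANN = 1.380649e-23  # Boltzmann's constant k, in joules per kelvin

def landauer_limit(temperature_kelvin):
    """Minimum energy in joules to erase one bit at the given temperature."""
    return BOLTZMANN * temperature_kelvin * math.log(2)

# At roughly room temperature (300 K), erasing one bit costs about 2.87e-21 J.
e_bit = landauer_limit(300.0)

# For perspective: one watt-hour (3600 J) would pay for erasing on the
# order of 10^24 bits at this limit, far beyond any real hardware today.
bits_per_watt_hour = 3600.0 / e_bit
```

Running this makes the point in the paragraph above concrete: real CPUs
dissipate many orders of magnitude more energy per operation than the
Landauer bound, which is why the limit is not yet a practical concern.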
John K Clark
You received this message because you are subscribed to the Google Groups
"Everything List" group.