On 9/20/2013 4:40 PM, Chris de Morsella wrote:
Current software is very energy inefficient -- and on so many levels. I worked developing
code used in the Windows Smartphone and it was during that time that I had to first
think hard about the energy efficiency dimension in computing -- as measured by useful
work done per unit of energy. The engineering management in that group was constantly
harping on the need to produce energy efficient code.
Programmers have a lot of deeply ingrained bad habits -- and not only in terms of
producing energy efficient software. For example, most developers will instinctively grab
large chunks of resources -- in order to ensure that their processes are not starved of
resources in some kind of peak scenario. While this may be good for the application --
when measured by itself -- it is bad for the overall footprint of the application on the
device (bloat) and for the energy requirements that that software will impose on the
hardware. Another example of a common bad practice is poorly written synchronization code
(or synchronized containers).
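A toy sketch of the over-allocation habit (hypothetical Python, names purely illustrative, not from any real codebase):

```python
# Hypothetical sketch: two ways to hold working storage.

def eager_buffer(n_expected_peak):
    # The bad habit: grab a large chunk up front "just in case",
    # so the process pays the memory (and energy) cost of the
    # peak scenario all the time, whether or not it arrives.
    return [0] * n_expected_peak

def lazy_buffer():
    # Grow only as demand materializes; typical workloads
    # never touch the peak, so the footprint stays small.
    return []

big = eager_buffer(1_000_000)   # a million slots held even if ten are used
small = lazy_buffer()
small.append(1)                 # footprint tracks actual demand
```

The eager version may look "safe" for the application in isolation, which is exactly the trap described above: each process hoarding for its own peak bloats the device-wide footprint.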
These bad practices (anti-patterns in the jargon) can not only have a huge impact on
performance in peak usage scenarios, but also act to increase the energy requirements
for that software to run.
I think that -- with a lot of programming effort, of course (which is why it will never
happen) -- the current code base has huge headroom for improvement in energy efficiency,
and not only in the mobile small-device space, where it is clearly important, but in
datacenter-scale applications and exposed services as well. But for this to happen there
first has to be a profound cultural change amongst software developers, who are being
driven by speed to market and other draconian economic and marketing imperatives, and are
producing code under these kinds of deadlines and constraints.
There's a lot of bad design in consumer electronics, particularly in user interfaces,
because the pressure is to get more and newer features and apps. Eventually (maybe
already) this will slow down and designers will start to pay more attention to refining
the stuff already there.
If there is a theoretical minimum that derives from the second law of thermodynamics, it
must be exceedingly far below the current practical minimums for actual real-world
computing systems. And I do not see how a minimum can be determined without reference to
the physical medium in which the computing system being measured is implemented.
It is determined by the temperature of the environment into which entropy must be dumped
in order to execute irreversible operations (like erasing a bit). But you're right that
current practical minimums are very far above the Landauer limit, so it has no effect on
current design practice. Current practice is limited by heat dissipation.
In fact, how could a switch be implemented without being realized in some medium that
contains the switch?
The way to completely avoid Landauer's limit is to make all operations reversible, never
losing any information, so that the whole calculation could be run backwards. Then no
entropy is dumped to the environment and Landauer's limit doesn't apply.
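A minimal sketch of what "reversible" means here, using the Toffoli (controlled-controlled-NOT) gate, a standard universal reversible gate; the Python encoding is illustrative:

```python
def toffoli(a, b, c):
    # Controlled-controlled-NOT: flips the target bit c only when
    # both control bits a and b are set. The map on (a, b, c) is a
    # bijection on the 8 possible states -- no information is lost.
    return a, b, c ^ (a & b)

# The gate is its own inverse: applying it twice recovers the input,
# so a circuit built from such gates can be run backwards and, in
# principle, dissipates no Landauer heat.
state = (1, 1, 0)
once = toffoli(*state)        # (1, 1, 1)
twice = toffoli(*once)        # back to (1, 1, 0)
```

Because every output state has exactly one preimage, no bit is ever erased, which is precisely the condition under which the kT ln 2 cost never has to be paid.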
You received this message because you are subscribed to the Google Groups
"Everything List" group.