Reversible computing seems like a fascinating possibility, but it is pretty far off. Even if economically feasible, mass-producible reversible physical logic gates and chip architectures were discovered today, the inertia of the existing code base would take many decades to work its way through the life cycle. Attempts to promote the parallelization of algorithms face this legacy problem as well.
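For what it's worth, *logically* reversible gate sets have been described on paper for decades — Toffoli's controlled-controlled-NOT gate embeds AND in a bijection — so the missing piece is really the cheap, mass-producible physical implementation. A toy Python sketch of the logical side (my own illustration, not anything from the thread):

```python
# Toy demonstration that the Toffoli (controlled-controlled-NOT) gate is
# logically reversible: it is a bijection on 3-bit states, and it computes
# AND when the target bit is initialized to 0.
from itertools import product

def toffoli(a, b, c):
    """Map (a, b, c) -> (a, b, c XOR (a AND b))."""
    return (a, b, c ^ (a & b))

# Reversibility: the gate is its own inverse, so applying it twice
# restores every input state and no information is ever destroyed.
states = list(product([0, 1], repeat=3))
assert all(toffoli(*toffoli(*s)) == s for s in states)

# Universality hint: with the target bit fixed at 0, the third output
# is exactly AND of the two control bits.
assert all(toffoli(a, b, 0)[2] == (a & b) for a, b in product([0, 1], repeat=2))
print("Toffoli is its own inverse and embeds AND")
```

Because no input state is ever erased, a physical realization of such a gate would, in principle, dodge the Landauer cost discussed below in John's reply.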
But by the time (if ever) a reversible set of the basic logic gates AND, NAND, OR, XOR is discovered, perhaps algorithms will have become so sophisticated that an existing legacy code base could be run through the various analyzers, the "intent" of the code discovered by an automatic self-tending process, and that map used as a template for generating code with equivalent user-facing functionality — essentially seamless for the user, but radically re-architected, refactored, and recompiled to run on a reversible architecture. A similar strategy could be used to achieve the maximum feasible parallelization of algorithms, by automatically rewriting the code base, and likewise for quantum computing algorithms. All still some ways off into the future, though; just somewhat pie-in-the-sky musings.

From: [email protected] [mailto:[email protected]] On Behalf Of John Clark
Sent: Friday, September 20, 2013 8:50 PM
To: [email protected]
Subject: Re: What gives philosophers a bad name?

On Fri, Sep 20, 2013 at 4:22 PM, Chris de Morsella <[email protected]> wrote:

>> A computation always takes a nonzero amount of energy to perform; theoretically you can make the energy used as close to zero as you like, but the less energy you use the slower the calculation.

> How does that square with the increased (well-measured) energy efficiency per fundamental unit of logic (single machine operation)? It takes far less energy to perform an elementary logic operation on a modern CPU than it did on, say, a CPU from ten years ago.

I'm talking about the theoretical limit dictated by the laws of physics; right now we are nowhere near that, and technological factors are astronomically more important.
According to Landauer's principle the minimum energy to change one bit of information is, in joules, kT*ln2, where k is Boltzmann's constant and T is the temperature in kelvins of the object doing the computation. A joule is a small amount of energy (one watt-hour is equal to 3600 joules), and Boltzmann's constant is a very, very small number, about 10^-23, so it will be some time before we have to start thinking seriously about ways to overcome this theoretical limit with something like reversible computing.

John K Clark

--
You received this message because you are subscribed to the Google Groups "Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [email protected].
To post to this group, send email to [email protected].
Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/groups/opt_out.
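As a sanity check on the arithmetic in John's paragraph (taking "room temperature" to be 300 K, my assumption), a few lines of Python:

```python
# Back-of-the-envelope check of the Landauer limit k*T*ln(2) at room
# temperature, and how many bit erasures one watt-hour could in
# principle pay for at that limit.
import math

k = 1.380649e-23                   # Boltzmann's constant, J/K
T = 300.0                          # assumed room temperature, kelvins
landauer_j = k * T * math.log(2)   # minimum energy to erase one bit

watt_hour_j = 3600.0               # 1 Wh = 3600 J, as stated above
bits_per_wh = watt_hour_j / landauer_j

print(f"Landauer limit at {T:.0f} K: {landauer_j:.3g} J per bit")      # ~2.87e-21 J
print(f"Bit erasures per watt-hour at the limit: {bits_per_wh:.3g}")   # ~1.25e24
```

So at the theoretical floor, a single watt-hour buys on the order of 10^24 bit erasures — which is why, as John says, technological factors dominate for the foreseeable future.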

