On 1/17/2013 9:25 PM, Stephen P. King wrote:
On 1/17/2013 4:21 PM, Russell Standish wrote:
From just the abstract alone, I can't see how this differs from the
Solomonoff universal prior?
OK, is that a good thing? It seems to me that it is. Are you
saying that the content of the paper is trivial?
Did you see the part about how it "is not computable and
thus can only be approximated in practice"?
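To make the comparison with the Solomonoff prior concrete: that prior assigns each output x the total weight 2^(-|p|) summed over programs p that print x. For a universal prefix-free machine this is uncomputable, so any implementation is a truncated approximation. Here is a minimal sketch under an assumed toy "machine" of my own invention (not the machine from the paper or the thread), just to illustrate the shape of the computation:

```python
from itertools import product

def toy_output(program):
    """Hypothetical toy machine (an assumption, not a universal TM):
    programs starting with '1' halt and output the integer value of the
    remaining bits; all other programs are treated as non-halting."""
    if not program or program[0] != '1':
        return None
    return int(program[1:], 2) if len(program) > 1 else 0

def solomonoff_weight(x, max_len=12):
    """Truncated lower bound on the prior mass of output x:
    sum 2^(-len(p)) over halting programs p of length <= max_len
    whose output equals x."""
    total = 0.0
    for n in range(1, max_len + 1):
        for bits in product('01', repeat=n):
            p = ''.join(bits)
            if toy_output(p) == x:
                total += 2.0 ** (-n)
    return total
```

Even in this toy model, outputs with shorter descriptions collect more prior mass (solomonoff_weight(0) > solomonoff_weight(1)), which is the qualitative behavior the universal prior is known for.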
On Wed, Jan 16, 2013 at 07:29:33PM -0500, Stephen P. King wrote:
Dear Bruno and Friends,
The paper that I have been waiting a long time for. ;-)
Algorithmic Thermodynamics
John C. Baez, Mike Stay
(Submitted on 11 Oct 2010)
Algorithmic entropy can be seen as a special case of entropy as
studied in statistical mechanics. This viewpoint allows us to apply
many techniques developed for use in thermodynamics to the subject
of algorithmic information theory. In particular, suppose we fix a
universal prefix-free Turing machine and let X be the set of
programs that halt for this machine. Then we can regard X as a set
of 'microstates', and treat any function on X as an 'observable'.
For any collection of observables, we can study the Gibbs ensemble
that maximizes entropy subject to constraints on expected values of
these observables. We illustrate this by taking the log runtime,
length, and output of a program as observables analogous to the
energy E, volume V and number of molecules N in a container of gas.
The conjugate variables of these observables allow us to define
quantities which we call the 'algorithmic temperature' T,
'algorithmic pressure' P and 'algorithmic potential' mu, since they
are analogous to the temperature, pressure and chemical potential.
We derive an analogue of the fundamental thermodynamic relation
dE = T dS - P dV + mu dN, and use it to study thermodynamic cycles
analogous to those for heat engines. We also investigate the values
of T, P and mu for which the partition function converges. At some
points on the boundary of this domain of convergence, the partition
function becomes uncomputable. Indeed, at these points the partition
function itself has nontrivial algorithmic entropy.
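The abstract's Gibbs-ensemble setup can be sketched numerically. The partition function sums exp(-beta*log(runtime) - gamma*length - delta*output) over halting programs; for a real universal prefix-free machine this set is uncomputable, so the sketch below substitutes a hypothetical toy machine (my assumption, not the paper's construction) and truncates the enumeration at a maximum program length:

```python
import math
from itertools import product

def toy_run(program):
    """Hypothetical toy machine standing in for a universal prefix-free
    Turing machine: programs starting with '1' halt, with runtime equal
    to the program length and output equal to the integer value of the
    remaining bits; all other programs are treated as non-halting."""
    if not program or program[0] != '1':
        return None
    runtime = len(program)
    output = int(program[1:], 2) if len(program) > 1 else 0
    return runtime, output

def partition_function(beta, gamma, delta, max_len=12):
    """Truncated approximation of the Gibbs partition function with
    observables log runtime (conjugate beta), program length (gamma),
    and program output (delta), enumerated up to max_len bits."""
    Z = 0.0
    for n in range(1, max_len + 1):
        for bits in product('01', repeat=n):
            p = ''.join(bits)
            result = toy_run(p)
            if result is None:
                continue
            runtime, output = result
            Z += math.exp(-beta * math.log(runtime)
                          - gamma * len(p)
                          - delta * output)
    return Z

print(partition_function(beta=1.0, gamma=1.0, delta=1.0))
```

The conjugate parameters beta, gamma, delta play the role of the paper's algorithmic temperature, pressure, and potential; the truncation at max_len is exactly the "can only be approximated in practice" caveat from the discussion above.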
Now to discuss how this is useful to define a local notion of a
measure for COMP.
You received this message because you are subscribed to the Google Groups
"Everything List" group.