From the abstract alone, I can't see how this differs from the Solomonoff universal prior?

Cheers

On Wed, Jan 16, 2013 at 07:29:33PM -0500, Stephen P. King wrote:
> Dear Bruno and Friends,
>
> The paper that I have been waiting a long time for. ;-)
>
> http://arxiv.org/abs/1010.2067
>
> Algorithmic Thermodynamics
>
> John C. Baez <http://arxiv.org/find/math-ph,math/1/au:+Baez_J/0/1/0/all/0/1>,
> Mike Stay <http://arxiv.org/find/math-ph,math/1/au:+Stay_M/0/1/0/all/0/1>
> (Submitted on 11 Oct 2010)
>
> Algorithmic entropy can be seen as a special case of entropy as studied
> in statistical mechanics. This viewpoint allows us to apply many
> techniques developed for use in thermodynamics to the subject of
> algorithmic information theory. In particular, suppose we fix a
> universal prefix-free Turing machine and let X be the set of programs
> that halt for this machine. Then we can regard X as a set of
> 'microstates', and treat any function on X as an 'observable'. For any
> collection of observables, we can study the Gibbs ensemble that
> maximizes entropy subject to constraints on expected values of these
> observables. We illustrate this by taking the log runtime, length, and
> output of a program as observables analogous to the energy E, volume V
> and number of molecules N in a container of gas. The conjugate
> variables of these observables allow us to define quantities which we
> call the 'algorithmic temperature' T, 'algorithmic pressure' P and
> 'algorithmic potential' mu, since they are analogous to the
> temperature, pressure and chemical potential. We derive an analogue of
> the fundamental thermodynamic relation dE = T dS - P dV + mu dN, and
> use it to study thermodynamic cycles analogous to those for heat
> engines. We also investigate the values of T, P and mu for which the
> partition function converges. At some points on the boundary of this
> domain of convergence, the partition function becomes uncomputable.
> Indeed, at these points the partition function itself has nontrivial
> algorithmic entropy.
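For what it's worth, the ensemble the abstract describes can be written out in standard Gibbs form. This is only a sketch from the abstract's own definitions; the Lagrange-multiplier labels beta, gamma, delta are my notation, not necessarily the paper's:

```latex
% Maximum-entropy (Gibbs) ensemble over the set X of halting programs,
% with observables E(x) = log runtime, V(x) = length, N(x) = output.
% beta, gamma, delta are the multipliers fixing the expected values.
p(x) = \frac{1}{Z}\, e^{-\beta E(x) - \gamma V(x) - \delta N(x)},
\qquad
Z(\beta, \gamma, \delta) = \sum_{x \in X} e^{-\beta E(x) - \gamma V(x) - \delta N(x)}.
% Identifying conjugates as in ordinary thermodynamics,
%   T = 1/beta,  P = gamma/beta,  mu = -delta/beta,
% yields the relation quoted in the abstract:
dE = T\, dS - P\, dV + \mu\, dN.
```

The convergence question the abstract raises is then just the question of where the sum defining Z converges as beta, gamma, delta vary.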
>
> Now to discuss how this is useful to define a local notion of a
> measure for COMP.
>
> --
> Onward!
>
> Stephen
>
> --
> You received this message because you are subscribed to the Google Groups
> "Everything List" group.
> To post to this group, send email to everything-list@googlegroups.com.
> To unsubscribe from this group, send email to
> everything-list+unsubscr...@googlegroups.com.
> For more options, visit this group at
> http://groups.google.com/group/everything-list?hl=en.

--
----------------------------------------------------------------------------
Prof Russell Standish                  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics      hpco...@hpcoders.com.au
University of New South Wales          http://www.hpcoders.com.au
----------------------------------------------------------------------------