On Mon, May 15, 2006 at 11:17:35AM +0200, Bruno Marchal wrote:
>
> Le 15-mai-06, à 02:04, Russell Standish a écrit :
>
> >>
> >> I guess it is a delicate point, a key point though, which overlaps the
> >> ASSA/RSSA distinction (that is: the Absolute Self Sampling Assumption
> >> versus the Relative Self Sampling Assumption).
> >>
> >> If you identify a "conscious first person history" with a "third
> >> person describable computation", it can be argued that an explanation
> >> for physics can be given by a Bayesian sort of anthropic reasoning
> >> based on some universal probability distribution like Hal Finney's
> >> Kolmogorovian UDist. Note that this approach relies also on Church's
> >> Thesis. Here somehow the TOE will be a winning little program. I agree
> >> that this would chase away the third person white rabbits.
>
> > I disagree. The UDist comes from looking at the measure induced on a
> > set of descriptions
>
> OK.
>
> > (or computations if you prefer,
>
> It is not the same. It changes the whole problem, especially from the
> Relative SSA (Self-Sampling Assumption).
>
> > although the two
> > are not equivalent),
>
> OK, why not take that difference into account? I think it is a
> crucial point.


I do :). However, it makes no difference as far as I can tell to the
Occam's razor issue.

> > given a reference Turing machine U. This appears
> > to be a 3rd person description, but it need not be so.
>
> I am not sure I understand.

Do you mean you don't think it is a 3rd person description, or do you
mean you don't think it can be anything else?

> > As I have
> > pointed out (but suspect it hasn't really sunk in yet), U can be
> > taken to be the observer erself.
>
> I could agree, but U cannot *know* e is U. Need some bet or act of
> faith.
> In general if U describes the observer, he is a "big" number in need
> of an explanation. I mean, the existence of "big stable U" is what we
> try to explain.

Sure. In fact the argument does not even hinge upon the observer being
a UTM. However, if not, then the distribution is not exactly universal
(hence my quotes below).

> > When done this way, there is a 1st
> > person "universal distribution", with a corresponding 1st person Occam
> > razor theorem. And this implies the absence of 1st person white
> > rabbits.
>
> I really don't understand.
>
> Bruno

The details, of course, are in my paper "Why Occam's Razor". To
summarise, an observer induces a map O(x) from the space of
descriptions (which is equivalent, AFAIK, to the output of your UD) to
the space of meanings. For any given meaning y, let omega(y,l) be the
number of equivalent descriptions of length l mapping to y (for
infinite-length descriptions we use their length-l prefixes). So

  omega(y,l) = |{x : O(x)=y & len(x)=l}|

Now

  P(y) = lim_{l->infinity} omega(y,l)/2^l

is a probability distribution, related to the Solomonoff-Levin
universal distribution, and

  C(y) = -log_2 P(y)

is a complexity measure related to Kolmogorov complexity. Basically
this is an Occam's Razor theorem: the probability of observing
something decreases dramatically with its observed complexity. And
this is a pure 1st person result.
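As a toy numerical illustration of this counting (my own sketch, not
from the paper), take a hypothetical observer map O that reads a
description's "meaning" off its leading bits -- here, the number of
leading 1-bits before the first 0 -- and approximate P(y) and C(y) at
a finite length l:

```python
from itertools import product
from math import log2

def O(x):
    """Hypothetical observer map: the 'meaning' of a bitstring is the
    number of leading 1-bits before the first 0."""
    y = 0
    for bit in x:
        if bit == '1':
            y += 1
        else:
            break
    return y

def omega(y, l):
    """omega(y, l) = |{x : O(x) = y and len(x) = l}| by brute-force
    enumeration of all 2^l descriptions of length l."""
    return sum(1 for bits in product('01', repeat=l)
               if O(''.join(bits)) == y)

l = 12
for y in range(4):
    P = omega(y, l) / 2**l          # finite-l approximation of P(y)
    C = -log2(P)                    # complexity C(y) = -log_2 P(y)
    print(f"y={y}  P(y)={P:.4f}  C(y)={C:.2f}")
# Output:
# y=0  P(y)=0.5000  C(y)=1.00
# y=1  P(y)=0.2500  C(y)=2.00
# y=2  P(y)=0.1250  C(y)=3.00
# y=3  P(y)=0.0625  C(y)=4.00
```

For this particular O, exactly 2^(l-y-1) descriptions of length l map
to meaning y, so P(y) = 2^-(y+1) and C(y) = y+1: a meaning that needs
a longer prefix to pin down is exponentially less likely to be
observed, which is the Occam's Razor behaviour in miniature.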
It doesn't get rid of all white rabbits, but the remaining ones are
dealt with by the Malcolm-Standish argument.

> http://iridia.ulb.ac.be/~marchal/
>

--
----------------------------------------------------------------------------
A/Prof Russell Standish                  Phone 8308 3119 (mobile)
Mathematics                                    0425 253119 (")
UNSW SYDNEY 2052                         [EMAIL PROTECTED]
Australia                                http://parallel.hpc.unsw.edu.au/rks
            International prefix +612, Interstate prefix 02
----------------------------------------------------------------------------

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google
Groups "Everything List" group.
To post to this group, send email to everything-list@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at
http://groups.google.com/group/everything-list
-~----------~----~----~----~------~----~------~--~---