Hal Ruhl wrote on 17/1/05:
Do you really mean that your "theory" would make you say no to a doctor offering you an artificial brain (even one built from a very low-level substitution description of yourself)?
First, assume that choice is available to sub-components of a world state.

I would not accept. Even if the dynamic is such that my world-state sequence suffers only minor shifts, such as jumping to slightly different machines, I do not believe there is a current description of me at a low enough level that the artificial brain would not lead my future history to diverge from what it would have been with my current biological brain. [The dynamic can eventually change my description on the fly in any event.] I would be selecting one future history over another. Merely having the procedure, or not having it, is such a selection [a choice] [my current brain would suffer some alternate future history as well], and this demonstrates that the two courses are not the same.
Having no way to select between these future histories, I would stay the course with what I have.
You are saying that accepting the choice of an artificial brain would change your future history as compared with continuing with your current brain.
1. Could we not in theory make the artificial brain arbitrarily close to the original, so that any ensuing change in the future would be no greater than chaos theory predicts would happen with mundane events anyway, such as the decision whether to scratch your head or not in the next minute?
2. Might not the new brain actually improve your future prospects, by being more durable than the original, for example?
3. Do you in any case agree that it might not be possible to tell that there has been a transition from a natural to an artificial brain, if the copy is functionally sufficiently close to the original?