On 14 Dec 2010, at 20:24, Brent Meeker wrote:

On 12/14/2010 7:30 AM, Jason Resch wrote:


I think the path to seeing the mind as a program is easier in this way:

1. It's not what the parts of the brain are made of, it's how they function, that determines behavior.

2. This leads to the idea of multiple realizability ( http://en.wikipedia.org/wiki/Multiple_realizability ): brains can be made in different ways so long as the parts function the same.

3. Accordingly, one could replace each neuron, or each atom (or whatever), with a device that behaved like what it was replacing. A man made out of antimatter would still be a man.

4. Philosophical zombies ( http://en.wikipedia.org/wiki/Philosophical_zombie ) are not possible. A zombie brain would have all the same beliefs and all the same information as the equivalently organized and behaving brain it replaced, so in what sense could one say this one's beliefs are wrong but that one's beliefs are right? There would be no way ever to prove that one is conscious and the other is not; it would be wrong for no reason at all. That is what it takes for the idea of zombies to be consistent. Further, the real brain and the zombie brain could never even report feeling different: since both brains contain the same information and the same knowledge, how could one of them report a difference in experience?

This addresses your question of whether there would be any impact on one's consciousness if the brain were swapped for a device with equivalent processing of information.
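The functional-equivalence idea in points 1–3 has a loose software analogue: two components with different internals but identical input/output behavior are interchangeable in any system that uses them. A minimal sketch (the class and function names here are illustrative, not anything from the thread):

```python
# Multiple realizability as duck typing: two "neurons" realized
# differently, but with identical firing behavior.

class ThresholdNeuron:
    """Fires when the summed input reaches a threshold (arithmetic test)."""
    def __init__(self, threshold):
        self.threshold = threshold

    def fire(self, inputs):
        return sum(inputs) >= self.threshold

class TableNeuron:
    """Same behavior, realized as a precomputed lookup table instead."""
    def __init__(self, threshold):
        # A different "substrate": a dict, not an arithmetic comparison.
        self.table = {s: s >= threshold for s in range(100)}

    def fire(self, inputs):
        return self.table[sum(inputs)]

def network_output(neuron, signals):
    # The surrounding "network" depends only on function, not implementation.
    return [neuron.fire(s) for s in signals]

signals = [[1, 2], [0, 0], [3, 3]]
a = network_output(ThresholdNeuron(3), signals)
b = network_output(TableNeuron(3), signals)
assert a == b  # functionally identical, differently realized
```

Swapping one realization for the other changes nothing observable from outside, which is the sense in which point 3 licenses neuron-by-neuron replacement.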

I don't disagree with any of the above. But there is a complexity that is passed over. Having information, and being able to equate "the same information", implies that the processes in the brain are about something, something that the differently realized brains can agree on. I think this requires an external world with which they both can interact.

The problem, that is *the* mind-body problem in the mechanist frame, is that a digital mechanism cannot distinguish a local simulation of an external world from an external world itself. This eventually leads to treating external worlds as a statistical sum over all the computations going through our current state.



You received this message because you are subscribed to the Google Groups 
"Everything List" group.