On 15 May 2015, at 01:48, Jason Resch wrote:



On Wed, May 13, 2015 at 9:29 AM, Bruno Marchal <marc...@ulb.ac.be> wrote:

On 13 May 2015, at 03:59, Jason Resch wrote:

Chalmers' fading qualia argument shows that if replacing a biological neuron with a functionally equivalent silicon neuron changed conscious perception, it would lead to an absurdity. Either: 1. qualia fade/change as silicon neurons gradually replace the biological ones, leading to a case where the qualia end up completely out of touch with the functional state of the brain,
or
2. the replacement eventually leads to a sudden and complete loss of all qualia, but this suggests that a single neuron, or even a few molecules of that neuron, when substituted, somehow completely determines the presence of qualia.

His argument is convincing, but what happens when we replace neurons not with functionally identical ones, but with neurons that fire according to an RNG? In almost every case, the random firings will result in completely different behavior, but what about the one (immensely rare) case where the random firings, by chance, match the firing patterns of the neurons they replaced?
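To make the odds concrete, here is a minimal, purely illustrative sketch: a deterministic "functional" neuron versus a coin-flip neuron, counting how often the coin flips happen to reproduce the deterministic firing trace. The threshold rule and all names are hypothetical toys, not any real neuroscience model.

```python
import random

# Purely illustrative toy model: a "neuron" is reduced to a binary
# firing decision per time step.

def functional_neuron(inputs):
    """Deterministic rule: fire iff at least two inputs are active."""
    return sum(inputs) >= 2

def random_neuron(rng):
    """Ignores its inputs entirely; fires on a fair coin flip."""
    return rng.random() < 0.5

# A short history of synaptic inputs, and the firing trace that the
# functional neuron produces on it.
input_trace = [(1, 1, 0), (0, 0, 1), (1, 1, 1), (0, 1, 1)]
target = [functional_neuron(inputs) for inputs in input_trace]

# The chance that coin flips reproduce a k-step trace is (1/2)**k,
# so the "lucky" case becomes astronomically rare as the history
# grows; yet when it occurs, the two traces are indistinguishable.
trials = 10_000
matches = 0
for seed in range(trials):
    rng = random.Random(seed)
    if [random_neuron(rng) for _ in input_trace] == target:
        matches += 1
print(f"lucky random neurons: {matches}/{trials} "
      f"(expected about {trials / 2 ** len(input_trace):.0f})")
```

The match probability shrinks exponentially with the length of the history, which is exactly why the lucky random brain is immensely rare while remaining externally indistinguishable when it occurs.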

In this case, behavior as observed from the outside is identical. Brain patterns and activity are similar, but according to computationalism the consciousness is different, or perhaps absent entirely, a zombie (if all neurons are replaced with randomly firing neurons). Presume that the activity of neurons in the visual cortex is required for visual qualia, and that all neurons in the visual cortex are replaced with randomly firing neurons which, by chance, mimic the behavior of neurons when viewing an apple.

Is this not an example of fading qualia, or qualia desynchronized from the brain state? Would this person feel that they are blind, or lack visual qualia, all the while being unable to express their deficiency? When Searle argued that this exact same thing would occur when functionally identical artificial neurons were substituted for biological ones, I used to think it was completely ridiculous, for there would be no room in the functionally equivalent brain to support thoughts such as "Help! I can't see, I am blind!", since the information content of the brain is identical when the neurons are functionally identical.

But then how does this reconcile with fading qualia as the result of substituting randomly firing neurons? The computations are not the same, so presumably the consciousness is not the same. But also, the information content does not support knowing/believing/expressing/thinking that something is wrong. If anything, the information content of this random brain is much less, yet the result seems to be qualia out of sync with the global state of the brain. Can anyone here shed some clarity on what they think happens, and how to explain the rare case of luckily working, randomly firing neurons, when only a partial substitution of the neurons in a brain is performed?

Nice idea, which leads again to the absurdity of linking consciousness to the right "physical activity" instead of to the abstract computation (at the right level).

So would such a person have fading/diminishing qualia?

The person is in Platonia (i.e. distributed over infinitely many true sigma_1 sentences), and survives where it is self-referentially correct relative to an infinity of computations. The person is not in the running of a computer in front of you, which is part of your (stable) illusion (there is only 0, s(0), s(s(0)), ..., and their addition and multiplication relations). The mystery is in the fact that such an illusion looks computable, which would contradict comp (here comp is saved by QM, which shows that there is something non-computable but observable).
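(A side note on the notation, for readers unfamiliar with it: a sigma_1 sentence of arithmetic asserts that some number satisfies a mechanically checkable condition. A minimal sketch, with the concrete example chosen here purely for illustration:)

```latex
% General form of a \Sigma_1 sentence: one existential quantifier
% over a matrix \varphi containing only bounded quantifiers.
\exists x\, \varphi(x)

% A concrete true example in the language \{0, s, +, \times\}:
% "some number plus 2 equals 4", witnessed by x = s(s(0)).
\exists x\, \bigl(x + s(s(0)) = s(s(s(s(0))))\bigr)
```

A true sigma_1 sentence is exactly one that a universal machine can verify by searching for the witness until it turns up, which is the sense in which such sentences encode halting computations.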





Only one problem: to use "Chalmers' strategy", you need to change one neuron at a time, but then a little change will quickly spread abnormal behavior to the other neurons (which do not yet fire randomly). So you have to change all the neurons at once, in this case.

It is possible in theory, if you're running a simulated brain: indicate that at time T some subset of the neurons stops executing its regular neuron-simulation code and instead follows random neuron code.
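A minimal sketch of that switch, in the same illustrative spirit as the earlier one (the ring-shaped toy network and all names are hypothetical, not a real brain simulator):

```python
import random

# Toy "simulated brain": n neurons on a ring, where each neuron's
# regular rule is to copy its left neighbour's previous firing state.
# At time T, the neurons in `swapped` stop running that rule and
# fire by coin flip instead.

def step(state, t, T, swapped, rng):
    n = len(state)
    return [
        rng.random() < 0.5 if (t >= T and i in swapped)
        else state[(i - 1) % n]  # regular rule: copy left neighbour
        for i in range(n)
    ]

def simulate(n=8, steps=20, T=10, swapped=frozenset({0}), seed=1):
    rng = random.Random(seed)
    state = [i == 0 for i in range(n)]  # one firing neuron to start
    history = [state]
    for t in range(1, steps):
        state = step(state, t, T, swapped, rng)
        history.append(state)
    return history

# Before T the pattern rotates deterministically; from T on, neuron 0
# fires at random, and its random bits are then copied around the
# ring by the *unswapped* neurons.
for t, state in enumerate(simulate()):
    print(t, "".join("1" if fired else "0" for fired in state))
```

Running it shows the deterministic pattern up to T, then noise from the swapped neuron propagating through the rest of the ring, which also illustrates Bruno's point above about abnormal behavior quickly spreading to the neurons that were not swapped.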

Let us say that we already know that consciousness does not depend on the low-level implementation. (Below the substitution level, there is an infinity of them.)

MGA does show that, at some level, attributing consciousness to the physical activity is like saying that it was not Deep Blue that won the chess tournament, but the Z8000, the processor (supposedly) used that day.

If from one o'clock to five o'clock your neurons run randomly, but by ultra-incredible chance you get the right physical activity, well, you, in Platonia, are lucky: with some luck the right determinacy comes back, and if it does, at five o'clock you will obviously not mention any fading qualia.

You will ask me: and what if the determinacy never comes back, but the luck continues? Then we have to treat the person as if he were not a zombie, by definition, since, by luck, it incarnates the right counterfactuals, just in advance, so to speak. It is like a very trivial machine, still just a body, which never thinks per se, together with an oracle making it act as the normal computations demand. We have to ascribe consciousness to the person, in Platonia, never to its 3p representation relative to us. To do otherwise is to confuse a person with its mask or clothes.





This might at first seem to mean going from consciousness to zero consciousness, except that we already know (by MGA, normally) that consciousness is just not associated with *any* physical activity, not even with computations.

So the vanishingly rare actual computations in math that are random but behave, by luck, like zombies have no consciousness, and we can discount them?

No consciousness at all. Consciousness, like matter, will be in the infinite sum. The bodies are the "illusion" here. And there are reasons to believe that we are related to some random oracle, probably allowing the quantum phase to get rid of the white rabbit, but this still needs some work.



Are there rare creatures somewhere in Platonia that see while feeling as though they're blind?

No. At the level I think you intend, that would be a contradiction with comp.

At a different, higher level, you might mean all lucid Platonist machines. As reality is not WYSIWYG for them, seeing can blind you to the real, or to a part of the real.

Bruno



Jason


In fact the people that we can see are sort of p-zombies, in some sense, but this is because we see only the 3p body, and the 3p bodies are not conscious: they are only "pointers" to the person, which is in Platonia, and is conscious, in Platonia. (Note that this means that we are, in some sense, in Platonia, at the limit of all computations.)

I am aware that this is counter-intuitive, but not much more so than general relativity or QM.





http://iridia.ulb.ac.be/~marchal/


