On 14 May 2015 at 00:29, Bruno Marchal <marc...@ulb.ac.be> wrote:

[Jason]
> Chalmers' fading qualia argument shows that if replacing a biological
> neuron with a functionally equivalent silicon neuron changed conscious
> perception, it would lead to an absurdity, either:
> 1. qualia fade/change as silicon neurons gradually replace the biological
> ones, leading to a case where the qualia are completely out of touch with
> the functional state of the brain.
> or
> 2. the replacement eventually leads to a sudden and complete loss of all
> qualia, but this suggests that a single neuron, or even a few molecules of
> that neuron, when substituted, somehow completely determines the presence
> of qualia.
>
> His argument is convincing, but what happens when we replace neurons not
> with functionally identical ones, but with neurons that fire according to
> an RNG? In almost every case, the random firings of the neurons will
> result in completely different behavior, but what about that one
> (immensely rare) case where the random neuron firings, by chance, equal
> the firing patterns of the substituted neurons?
>
> In this case, behavior as observed from the outside is identical. Brain
> patterns and activity are similar, but according to computationalism the
> consciousness is different, or perhaps absent altogether (a zombie, if all
> neurons are replaced with randomly firing neurons). Presume that the
> activity of neurons in the visual cortex is required for visual qualia,
> and that all neurons in the visual cortex are replaced with randomly
> firing neurons which, by chance, mimic the behavior of neurons when
> viewing an apple.
>
> Is this not an example of fading qualia, or qualia desynchronized from the
> brain state? Would this person feel that they are blind, or lack visual
> qualia, all the while not being able to express their deficiency? When
> Searle argued that this exact same thing would occur on substituting
> functionally identical artificial neurons for biological ones, I used to
> think it was completely ridiculous, for there would be no room in the
> functionally equivalent brain to support thoughts such as "help! I can't
> see, I am blind!", since the information content of the brain is identical
> when the neurons are functionally identical.
>
> But then how does this reconcile with fading qualia as the result of
> substituting randomly firing neurons? The computations are not the same,
> so presumably the consciousness is not the same. But also, the information
> content does not support knowing/believing/expressing/thinking that
> something is wrong. If anything, the information content of this random
> brain is much less, but it seems the result is something where the qualia
> are out of sync with the global state of the brain. Can anyone here shed
> some clarity on what they think happens, and how to explain the rare case
> of luckily working, randomly firing neurons, when only a partial
> substitution of the neurons in a brain is performed?
>
[Bruno]
> Nice idea, which leads again to the absurdity of linking consciousness to
> the right "physical activity" instead of to the abstract computation (at
> the right level).

Yes, all these arguments - MGA, Maudlin, Putnam's rock - converge on
the idea that consciousness cannot be dependent on physical activity.
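
To make Jason's lucky-RNG case concrete, here is a minimal sketch in
Python (a toy threshold neuron; the model, the names and the seed search
are my own illustrative assumptions, nothing anyone actually proposed).
A deterministic neuron and a neuron that ignores its inputs and fires at
random can, on a rare run, emit exactly the same spike train, so the 3p
behavior is indistinguishable while the underlying computation is
entirely different:

import random

def functional_neuron(inputs, threshold=2):
    # Deterministic toy neuron: fire iff enough inputs are active.
    return [1 if sum(x) >= threshold else 0 for x in inputs]

def random_neuron(inputs, rng):
    # RNG-driven replacement: ignores its inputs entirely.
    return [rng.randint(0, 1) for _ in inputs]

inputs = [(1, 1, 0), (0, 0, 1), (1, 1, 1), (0, 1, 1)]
target = functional_neuron(inputs)  # [1, 0, 1, 1]

# Search for the "immensely rare" case: a run of pure chance whose
# spike train happens to match the functional one exactly.
for seed in range(1000):
    if random_neuron(inputs, rng=random.Random(seed)) == target:
        print("seed %d reproduces %s by pure chance" % (seed, target))
        break

With 4 binary firings the match has probability 1/16 per run, so it
turns up quickly; for n firings it falls off as 1/2^n, which is why the
case is immensely rare for a real brain, yet never of measure zero.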

> Only one problem: to use "Chalmers' strategy", you need to change one
> neuron at a time, but then a little change will quickly spread abnormal
> behavior to the other neurons (which do not yet fire randomly). So in this
> case you have to change all the neurons at once. This might at first seem
> to mean going from consciousness to zero consciousness, except that we
> already know (by the MGA, normally) that consciousness is just not
> associated with *any* physical activity, not even physically instantiated
> computations.
>
> In fact, the people that we can see are sort of p-zombies, in some sense,
> but this is because we see only the 3p body, and 3p bodies are not
> conscious: they are only "pointers" to the person, who is in Platonia, and
> is conscious, in Platonia. (Note that this means that we are, in some
> sense, in Platonia, at the limit of all computations.)

I think of it like three physical objects implementing the number 3: the
number 3 was there already.

> I am aware that this is counter-intuitive, but not much more so than
> general relativity or QM.
>
> Bruno
>
>
>
> Jason
>
> http://iridia.ulb.ac.be/~marchal/



-- 
Stathis Papaioannou

