On 19 May 2015 at 11:05, Jason Resch <jasonre...@gmail.com> wrote:
>
>
> On Mon, May 18, 2015 at 10:05 AM, Stathis Papaioannou <stath...@gmail.com>
> wrote:
>>
>>
>>
>> On Tuesday, May 19, 2015, Bruno Marchal <marc...@ulb.ac.be> wrote:
>>>
>>>
>>> On 16 May 2015, at 07:10, Stathis Papaioannou wrote:
>>>
>>> On 13 May 2015, at 11:59 am, Jason Resch <jasonre...@gmail.com> wrote:
>>>
>>> Chalmers' fading qualia argument shows that if replacing a biological
>>> neuron with a functionally equivalent silicon neuron changed conscious
>>> perception, it would lead to an absurdity, either:
>>> 1. qualia fade/change as silicon neurons gradually replace the biological
>>> ones, leading to a case where the qualia are completely out of touch
>>> with the functional state of the brain,
>>> or
>>> 2. the replacement eventually leads to a sudden and complete loss of all
>>> qualia, which suggests that a single neuron, or even a few molecules of
>>> that neuron, when substituted, somehow completely determines the presence
>>> of qualia.
>>>
>>> His argument is convincing, but what happens when we replace neurons not
>>> with functionally identical ones, but with neurons that fire according to
>>> an RNG? In almost every case the random firings of the neurons will result
>>> in completely different behaviors, but what about the one (immensely rare)
>>> case where the random neuron firings, by chance, match the firing patterns
>>> of the neurons they replaced?
>>>
>>> In this case, behavior as observed from the outside is identical. Brain
>>> patterns and activity are similar, but according to computationalism the
>>> consciousness is different, or perhaps absent altogether (a zombie, if all
>>> neurons are replaced with randomly firing neurons). Presume that the
>>> activity of neurons in the visual cortex is required for visual qualia,
>>> and that all neurons in the visual cortex are replaced with randomly
>>> firing neurons which, by chance, mimic the behavior of neurons when
>>> viewing an apple.
>>>
>>> Is this not an example of fading qualia, or qualia desynchronized from
>>> the brain state? Would this person feel that they are blind, or lack
>>> visual qualia, all the while being unable to express their deficiency?
>>> When Searle argued that this exact same thing would occur when
>>> functionally identical biological neurons were substituted with artificial
>>> neurons, I used to think it was completely ridiculous, for there would be
>>> no room in the functionally equivalent brain to support thoughts such as
>>> "Help! I can't see, I am blind!", since the information content in the
>>> brain is identical when the neurons are functionally identical.
>>>
>>> But then how does this reconcile with fading qualia as the result of
>>> substituting randomly firing neurons? The computations are not the same,
>>> so presumably the consciousness is not the same. But also, the information
>>> content does not support knowing/believing/expressing/thinking that
>>> something is wrong. If anything, the information content of this random
>>> brain is much less, yet it seems the result is something where the qualia
>>> are out of sync with the global state of the brain. Can anyone here shed
>>> some light on what they think happens, and how to explain the rare case of
>>> luckily working randomly firing neurons, when only a partial substitution
>>> of the neurons in a brain is performed?
>>>
>>>
>>> So Jason, are you still convinced that the random neurons would not be
>>> conscious? If you are, you are putting the cart before the horse. The
>>> fading qualia argument makes the case that any process preserving function
>>> also preserves consciousness. Any process: that computations are one such
>>> process is fortuitous.
>>>
>>>
>>> But the random neurons do not preserve function, nor does the "movie".
>>> OK?
>>
>>
>> I don't see why you're so sure about this. Function is preserved while the
>> randomness corresponds to normal activity; then it all falls apart. If by
>> some miracle it continued, then the random brain is as good as a normal
>> brain, and I'd say "yes" to the doctor offering me such a brain. If you
>> don't think that counts as computation, OK - but it would still be
>> conscious.
>>
>
>
> My third-person function would indeed be preserved by such a "Miracle
> Brain", but I strongly doubt it would preserve my first-person experience.
> Why do you think the randomly firing neurons preserve consciousness? Do you
> think they would still preserve consciousness if they became physically
> separated from each other yet maintained the same firing patterns?

I think the random neurons would preserve consciousness because
otherwise you could make a partial zombie, as you pointed out. There
is nothing incoherent about randomly firing neurons sustaining
consciousness, but there is about partial zombies. If the neurons,
including motor neurons, became physically separated but maintained
the same firing pattern, then yes, that would also preserve
consciousness.


-- 
Stathis Papaioannou
