On 18 May 2015 at 16:04, Jason Resch <[email protected]> wrote:
>
>
> On Sat, May 16, 2015 at 12:10 AM, Stathis Papaioannou <[email protected]>
> wrote:
>>
>> On 13 May 2015, at 11:59 am, Jason Resch <[email protected]> wrote:
>>
>> Chalmers' fading qualia argument shows that if replacing a biological
>> neuron with a functionally equivalent silicon neuron changed conscious
>> perception, it would lead to an absurdity, either:
>> 1. qualia fade/change as silicon neurons gradually replace the biological
>> ones, leading to a case where the qualia end up completely out of touch
>> with the functional state of the brain.
>> or
>> 2. the replacement eventually leads to a sudden and complete loss of all
>> qualia, but this suggests that a single neuron, or even a few molecules of
>> that neuron, when substituted, somehow completely determines the presence
>> of qualia.
>>
>> His argument is convincing, but what happens when we replace neurons not
>> with functionally identical ones, but with neurons that fire according to
>> an RNG? In all but one case, the random firings of the neurons will result
>> in completely different behavior, but what about the one (immensely rare)
>> case where the random neuron firings, by chance, equal the firing patterns
>> of the substituted neurons?
>>
>> In this case, behavior as observed from the outside is identical. Brain
>> patterns and activity are similar, but according to computationalism the
>> consciousness is different, or perhaps absent altogether, a zombie (if all
>> neurons are replaced with randomly firing ones). Presume that the activity
>> of neurons in the visual cortex is required for visual qualia, and that
>> all neurons in the visual cortex are replaced with randomly firing neurons
>> which, by chance, mimic the behavior of neurons when viewing an apple.
>>
>> Is this not an example of fading qualia, or qualia desynchronized from the
>> brain state? Would this person feel that they are blind, or lack visual
>> qualia, all the while being unable to express their deficiency? I used to
>> think it completely ridiculous when Searle argued that this exact same
>> thing would occur when functionally identical artificial neurons were
>> substituted for biological ones, for there would be no room in the
>> functionally equivalent brain to support thoughts such as "help! I can't
>> see, I am blind!", since the information content in the brain is identical
>> when the neurons are functionally identical.
>>
>> But then how does this reconcile with fading qualia as the result of
>> substituting randomly firing neurons? The computations are not the same,
>> so presumably the consciousness is not the same. But the information
>> content also does not support knowing/believing/expressing/thinking that
>> something is wrong. If anything, the information content of this random
>> brain is much lower, yet the result seems to be one where the qualia are
>> out of sync with the global state of the brain. Can anyone here shed some
>> light on what they think happens, and how to explain it, in the rare case
>> of luckily working randomly firing neurons, when only a partial
>> substitution of the neurons in a brain is performed?
>>
>>
>> So Jason, are you still convinced that the random neurons would not be
>> conscious?
>
>
> I believe that conclusion is consistent with computationalism.
>
>>
>> If you are, you are putting the cart before the horse. The fading qualia
>> argument makes the case that any process preserving function also preserves
>> consciousness.
>
>
> But that argument only stands (I think) in the case of functional
> equivalence (at some level). In this case the functions were defined not
> to be equivalent; in terms of input and output they are equivalent, but
> the computational implementation is definitely different. Perhaps this is
> just a case where the equivalence between functionalism and
> computationalism breaks down (if by functionalism we count only inputs and
> outputs).

It is the function that must be equivalent, not the computations;
otherwise, it's like insisting that only meat can think.
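For what it's worth, the distinction being drawn here (same inputs and
outputs, different computational implementation) can be sketched in a few
lines of Python. Everything below is invented purely for illustration, not
a claim about real neurons: a deterministic "neuron" computes its output
from its inputs, while a "lucky random" neuron ignores its inputs and
replays a sequence that, by immense chance, coincides with the
deterministic output. A downstream observer sees identical behaviour.

```python
def deterministic_neuron(inputs, threshold=2):
    """Fire (1) whenever the summed input meets the threshold."""
    return [1 if sum(x) >= threshold else 0 for x in inputs]

def lucky_random_neuron(inputs, recorded_firings):
    """Ignore the inputs entirely and replay a pre-recorded sequence,
    standing in for the immensely rare random source that happens to
    emit exactly the right spike train."""
    return list(recorded_firings)

stimuli = [(1, 1, 0), (0, 0, 1), (1, 1, 1), (0, 1, 0)]
functional_out = deterministic_neuron(stimuli)
random_out = lucky_random_neuron(stimuli, functional_out)

print(functional_out == random_out)  # True: indistinguishable from outside
```

From the outside the two are function-for-function interchangeable, which
is exactly why a view that counts only inputs and outputs cannot tell them
apart, while a view that counts the internal computation can.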


-- 
Stathis Papaioannou

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/d/optout.
