On 2 March 2014 22:18, Bruno Marchal <[email protected]> wrote:
>
> On 02 Mar 2014, at 08:09, Stathis Papaioannou wrote:
>
>
>
> On 1 March 2014 01:40, Bruno Marchal <[email protected]> wrote:
>
>>> If you start with the assumption that the physics relevant to brain
>>> function is not computable then computationalism is false: it would be
>>> impossible to make a machine that behaves like a human, either zombie
>>> or conscious.
>>
>>
>> I agree with you, the physics *relevant* to brain function has to be
>> computable, for comp to be true. But the point is that below the
>> substitution level, the physical details are not relevant. Then by the
>> FPI, they must be undetermined, and this on an infinite non-computable
>> domain, and so, our "computable brain" must rely on a non-computable
>> physics, or a not necessarily computable physics, with some non-computable
>> aspect. This is what comp predicts, and of course this is confirmed by
>> QM. Again, eventually, QM might be too computable for comp to be true.
>> That is what remains to be seen.
>>
>>
>>> What I mean by functionalism is that the way the brain processes
>>> information, its I/O behaviour, is what generates mind. This implies
>>> multiple realisability of mental states, insofar as the same
>>> information processing could be done by another machine. If the
>>> machine is a digital computer then functionalism reduces to
>>> computationalism. If the brain utilises non-computable physics then
>>> you won't be able to reproduce its function (and the mind thus
>>> generated) with a digital computer, so computationalism is false.
>>> However, that does not necessarily mean that functionalism is false,
>>> since you may be able to implement the appropriate brain function
>>> through some other means. For example, if it turns out that a digital
>>> implementation of the brain fails because real numbers and not
>>> approximations are necessary, it may still be possible to implement a
>>> brain using analogue devices.
>>
>>
>> OK, but that functionalism seems to me trivially true. How could such
>> functionalism be refuted, if you can invoke arbitrary functions?
>> (Also, "functionalism" is used for a stronger (less general) version of
>> computationalism, by Putnam, so this use of functionalism is non-standard
>> and can be confusing.)
>> Last remark: I am not sure that the notion of information processing can
>> make sense in a non-digital framework. In both quantum and classical
>> information theory, information is digital (words like bits and qubits
>> come from there).
>
> I think functionalism is true, but it's not obviously true, at least to
> most people. It could be that the observable behaviour of the brain is
> reproduced perfectly but the resulting creature has no consciousness, or a
> different consciousness.
>
>
> What if someone says that the function of the brain is to provide
> consciousness? Is that functionalism?
> What if someone says that the function of the brain is to link a "divine"
> soul to a person through a body?
> What is a function?

No, a function is an observable pattern of behaviour. Functionalism says
that if you replicate this, you also replicate the mind. You need to
replicate not just one particular behaviour (which could be quite easy) but
all outputs for all inputs.

> That would be the case if consciousness were substrate-dependent.
>
>
> But you can put the substrate in the function. A brain would have the
> function of associating the experience to that substrate. How could I say
> no to the doctor who guarantees that all the functions of the brain are
> preserved?
> The term function, like set, is too general, too powerful.

Then I'm using it in a somewhat precise sense as above.

> It could also be that the behaviour cannot be reproduced by a computer
> because the substitution level requires non-computable physics (true
> randomness, real numbers, non-computable functions), but it could be
> reproduced by a non-computational device. So there are these possibilities
> with brain replacement:
>
> (a) the behaviour is not reproduced and neither is the consciousness;
>
>
> = ~ BEH-MEC
>
>
> (b) The behaviour is reproduced but the consciousness is not reproduced;
>
>
> ~ comp.
>
>
>
> (c) The behaviour is reproduced and so is the consciousness;
>
>
> = comp, unless the consciousness is that of an impostor.
>
>
>
> (d) The behaviour is not reproduced but the consciousness is
>
>
> = "bad" substitution.
>
>
>
>
>>> What can be proved is that if consciousness is due to the brain then
>>> replicating brain function in some other substrate will also replicate
>>> its consciousness.
>>
>>
>> OK. What I meant is that we cannot prove that consciousness is due to the
>> brain.
>
> Yes, a dualist, for example, could consistently deny functionalism,
>
>
> Not sure. It depends on how you define function.
>
>
>
> but someone who believes that consciousness is due to the brain could not.
>
>
> Most dualists believe that consciousness is due to the brain. They will
> usually deny that the functional relation can be obtained with this or
> that type of function, but to throw out all functions makes their theory
> spurious. They would lose both interactionism and epiphenomenalism.
>
> Maybe you are onto some idea, but you should take another word, as in
> philosophy of mind "functionalism" is used for Putnam's computationalism.
> I do see vaguely what you mean, but it is still hard to define this
> precisely. It corresponds to the infinitely many ways of weakening comp,
> by adding oracles, or things like that.
> My old definition of comp was ∃n (functionalism is true at level n),
> making functionalism à la Putnam a restricted form of comp. Your
> terminology is non-standard and can lead to confusion, imo. You might use
> "mechanism" instead, with the idea that not all machines are digital
> indeed.

I like this account of functionalism from the Internet Encyclopedia of
Philosophy:

quote>
Consider, for example, mouse traps. Mouse traps are devices for catching or
killing mice. Mouse traps can be made of most any material, and perhaps
indefinitely or infinitely many designs could be employed. The most
familiar sort involves a wooden platform and a metal strike bar that is
driven by a coiled metal spring and can be released by a trigger. But there
are mouse traps designed with adhesives, boxes, poisons, and so on. All
that matters to something’s being a mouse trap, at the end of the day, is
that it is capable of catching or killing mice.

Contrast mouse traps with diamonds. Diamonds are valued for their hardness,
their optical properties, and their rarity in nature. But not every hard,
transparent, white, rare crystal is a diamond—the most infamous alternative
being cubic zirconia. Diamonds are carbon crystals with specific molecular
lattice structures. Being a diamond is a matter of being a certain kind of
physical stuff. (That cubic zirconia is not quite as clear or hard as
diamonds explains something about why it is not equally valued. But even if
it were equally hard and equally clear, a CZ crystal would not thereby be a
diamond.)

These examples can be used to explain the core idea of functionalism.
Functionalism is the theory that mental states are more like mouse traps
than they are like diamonds. That is, what makes something a mental state
is more a matter of what it does, not what it is made of. This
distinguishes functionalism from traditional mind-body dualism, such as
that of René Descartes, according to which minds are made of a special kind
of substance, the res cogitans (the thinking substance). It also
distinguishes functionalism from contemporary monisms such as J. J. C.
Smart’s mind-brain identity theory. The identity theory says that mental
states are particular kinds of biological states—namely, states of
brains—and so presumably have to be made of certain kinds of stuff, namely,
brain stuff. Mental states, according to the identity theory, are more like
diamonds than like mouse traps. Functionalism is also distinguished from B.
F. Skinner’s behaviorism because it accepts the reality of internal mental
states, rather than simply attributing psychological states to the whole
organism. According to behaviorism, which mental states a creature has
depends just on how it behaves (or is disposed to behave) in response to
stimuli. In contrast functionalists typically believe that internal and
psychological states can be distinguished with a “finer grain” than
behavior—that is, distinct internal or psychological states could result in
the same behaviors. So functionalists think that it is what the internal
states do that makes them mental states, not just what is done by the
creature of which they are parts.
<end quote


--
Stathis Papaioannou



-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/groups/opt_out.
