Then if you take your theory seriously enough you will be led to a
Chalmers or Penrose sort of theory, which needs actual non-Turing-
emulable stuff to make your experience, or even "local nature", singular.
Why not? I find this a bit speculative, and I am more interested in the
consequence of the old idea that the soul is a number which moves
itself, which follows easily from the computationalist hypothesis, I
think. See my URL for more if you are interested, but I think you will
acknowledge we are working with different, and even incompatible,
hypotheses. I have no problem with that.
With the FOR book it is a bit different: there I argue indeed that once
you are a number, your neighborhood is *necessarily* given by, let us
say, some series of numbers.
Like quantum superposition, immateriality would be contagious. (To be
On 20 July 2006, at 02:16, David Nyman wrote <on the FOR list>:
> Don't know if anyone is still watching this thread, which I've just
> browsed with interest. For what it's worth, I don't believe we
> experience 'emotion', or anything else for that matter, in virtue of
> the attribution of 'information processing' to certain aspects of
> brain function. 'Information processing' is a metaphor projected onto
> highly restricted aspects of the overall behaviour of physical
> objects. Trivially, any behaviour of any object whatsoever can be
> described in terms of 'information processing'. By contrast, I take
> specific experiential structures to be robustly isomorphic with a
> unique physical constitution, howsoever this arrangement may be
> described externally in 'informational' terms.
> Computers also are physical objects and hence the question of whether
> they experience emotions or other conscious states must be referred
> empirically to their physical structure and behaviour in itself, not
> as projected into information processing terms. It must be recalled
> that what we choose to term a 'computer program' is merely an
> abstraction of a restricted set of aspects of the computer's physical
> behaviour under certain conditions. This abstraction is not, other
> than metaphorically, what is instantiated in an operational computer;
> rather what is instantiated is a set of physical behaviours. It is
> these behaviours - modulations of the physical substrate - that
> constitute the computer's experiential field, if any. Consequently it
> becomes a matter of empirical investigation to elucidate which aspects
> of the physical structure and behaviour of computers, or brains,
> ultimately produce relevant experiential states. Abstract
> 'informational' model building by itself simply creates misleading
> referential paradoxes.
> --- In [EMAIL PROTECTED], "Peter D Jones"
> <[EMAIL PROTECTED]> wrote:
>> --- In [EMAIL PROTECTED], Bruno Marchal <marchal@>
>>> On 24 April 2006, at 00:15, Nick Belane wrote:
>>>> Peter D Jones,
>>>> I'm sorry but I can't understand you at all.
>>>> However, I think the core of this debate is in Ray's mail.
>>>> He is much clearer than me and you!
>>> I agree with you and Ray. Just to prevent misunderstanding, I tend to
>>> use "consciousness" for "phenomenal consciousness", and I use
>>> "cognition" or a similar term for the "access" one.
>>> Phenomenal consciousness is not third-person sharable, and is
>>> the most typical first-person notion, I would say.
>> But you treat non-communicability as constituting phenomenality
>> (your phenomena are cognitive in every way except being communicable).
>> For most people, being incommunicable is merely a symptom of
You received this message because you are subscribed to the Google Groups
"Everything List" group.