Stathis Papaioannou wrote:
> Peter Jones writes:
> [quoting Russell Standish]
>>>>The Game of Life is known to be Turing complete. However, I do not
>>>>think any arrangement of dots in GoL could be conscious. Rather there
>>>>is an arrangement that implements a universal dovetailer. The UD is
>>>>quite possibly enough to emulate the full Multiverse (this is sort of where
>>>>Bruno's partial results are pointing), which we know contains conscious
>>>>observers.
> [quoting SP]
>>>That's putting it inversely compared to my (naive) understanding of how
>>>the UD works. I would have said:
>>>(a) some programs are associated with consciousness
>>>(b) the UD emulates all programs
>>>(c) hence, the UD emulates all the conscious programs
>>>In particular, I would have said that some sequence of frames in GoL is
>>>a particular consciousness that can interact with the universe providing
>>>its implementation, because we can observe the patterns, maybe even link
>>>them to real world events.
>>That is a strange passage. Are you saying that the links would
>>c) there is no difference between a) and b).
> The links would be causal in the normal sense of the word, i.e. the computer
> running GoL is an
> electronic device following the laws of physics, and we could link its output
> to real world events
> in the usual way that we interface with electronic computers.
>>>This does not necessarily mean that the consciousness is caused by or
>>>supervenes on the pattern of dots, any more than the number 3 is caused by
>>>or supervenes on a collection of 3 objects. If anything, it could be the other way around:
>>>the GoL pattern supervenes on, or is isomorphic with, the consciousness
>>>which resides in Platonia.
> Well, this is the whole problem we have been discussing these past few weeks.
> The computer
> exhibits intelligent behaviour and we conclude that it is probably conscious.
> The physical
> states of the computer are clearly the cause of its behaviour, and the means
> whereby we
> can observe it or interact with it, but is it correct to say that the
> physical states are the cause
> of its *consciousness*? At first glance, the answer is "yes". But what about
> a computer which
> goes through exactly the same physical states as part of a recording, as
> discussed in my other
> posts? If you say this is not conscious, you have a problem, because
> identical electrical activity
> in the computer's circuitry would then on one occasion cause consciousness
> and on another
> occasion not. If you say it is conscious, then you have to allow that a
> recording or an inputless
> machine can be conscious, something many computationalists are loath to do.
> Stathis Papaioannou
I think there are different kinds of consciousness. The kind in question here
seems to be the inner narrative - a story we tell ourselves about what's
happening and what we're thinking. If we think about how and why we would put
this into a robot, I think that provides some clues as to its nature. Its
function is to select out those events, including reasons and feelings, that
are significant and should be remembered. These are put into a kind of
coherent story (which often involves confabulation) and committed to memory.
This memory is then experience that can be drawn on in estimating and
evaluating future courses of action. So, on this theory, I would say that if
one repeated the same hardware events - including starting with the same
memory state - the second run would be conscious as well as the first.
You received this message because you are subscribed to the Google Groups
"Everything List" group.