on 6/11/00 1:50 am, [EMAIL PROTECTED] at [EMAIL PROTECTED] wrote:
> The problem may not be in modelling the hardware or software of brains that
> live in bodies. It may be that you need a body to have consciousness, and that
> body may have to function in certain ways. I am not certain how strongly I
> feel about this. Until I read Damasio's book I was convinced that machine
> consciousness was inevitable, and it may still be, but it may be very
> different from ours. It may be a lot more than simply having to converse with
> your children in a different language.
>
>
This is a point that is addressed by the Turing test as originally described
by Alan Turing in his paper on the 'imitation game'[1]. Every description of
this test I have seen in pop-sci or even in computer science texts misses
some important subtle points about what Turing actually described.
In the test the AI is *pretending* to be a human. It has to be smart enough,
despite being a very different type of entity, to fool the questioners
sufficiently well that they cannot tell the AI from the real humans better
than chance. It has to do this by acting, by imagining plausible replies to
questions outside its realm of experience, and so on. This is a subtler (and
harder) test than the one commonly presented as the 'Turing Test'.
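The 'better than chance' criterion above can be made concrete with a little
statistics. As a rough sketch (the judge counts below are entirely made up,
and Turing's paper doesn't prescribe this exact procedure), one could ask
whether the judges' rate of correct identifications differs significantly
from coin-flipping:

```python
import math

def binomial_p_value(correct, trials, p=0.5):
    """One-sided probability of at least `correct` successes in `trials`
    fair guesses -- how surprising the judges' accuracy would be if they
    were only guessing at chance."""
    return sum(math.comb(trials, k) * p**k * (1 - p)**(trials - k)
               for k in range(correct, trials + 1))

# Hypothetical numbers: judges make 100 identifications, 53 correct.
p_value = binomial_p_value(53, 100)

# A large p-value means 53/100 is consistent with guessing at chance,
# so on this reading of the test the machine has not been told apart.
print(p_value > 0.05)
```

The point of the sketch is only that 'cannot tell better than chance' is a
statistical claim about the judges' accuracy, not a requirement that every
single judge be fooled.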
[1] Which I read in some neat collection of seminal computer science texts
which also contained 'GOTO considered harmful'.
--
William T Goodall
[EMAIL PROTECTED]
http://www.wtgab.demon.co.uk