2010/1/14 Brent Meeker <meeke...@dslextreme.com>:

>> I think it would be enough for the AI to reproduce the I/O behaviour
>> of the whole brain in aggregate. That would involve computing a
>> function controlling each efferent nerve, taking as input the data
>> from the afferent nerves. The behaviour would have to be the same as
>> the brain's for all possible inputs, otherwise the AI might fail the
>> Turing test.
>>
>
> To have the same output for all possible inputs is a very strong condition
> and seems to go beyond functionalism.  Suppose (as seems likely) there are
> inputs that "crash" the brain (e.g. induce epileptic seizures).  Would the
> AI brain be less conscious because it didn't experience these seizures?
> Passing or failing the Turing test is a rather crude measure - after
> all, the interlocutor might simply guess right.

It would depend on whether the aim was to reproduce a particular
person (which you would want if you were thinking of replacing your
own brain) or just a generic human-level intelligence. If we want to
reproduce a particular person, the I/O behaviour would be allowed to
vary only as much as your behaviour might vary from day to day,
without alarming those who know you. If we want to make a generic AI,
the allowed variation could be greater.

>> It's not clear if the modelling would have to be at the molecular,
>> cellular or some higher level in order to achieve this, but in any
>> case I expect that there would be many different programs that could
>> do the job even if the hardware and operating system are kept the
>> same. It could therefore be a case of multiple computations leading to
>> the same experience. Pinning down a thought to a location in time and
>> space would pose no more of a problem for the AI than for the brain.
>>
>
> Then among those AI brains with different computations but the same I/O, you
> would have to find the same OMs (observer moments) constituted by different
> sequences of computational steps.
>
> My intuition is that having the same O for "most" I (some very large set of
> inputs) would be enough to instantiate consciousness - just not the same
> consciousness. I think there may be different kinds of consciousness, so a
> look-up table (like Searle's Chinese Room) may be conscious but in a
> different way.

Yes.
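
To illustrate the point that different computations can share the same
I/O, and that one of them can be a pure look-up table, here is a minimal
Python sketch (my own toy example, not anything from the thread):

    # Two programs with identical I/O over a large set of inputs: one
    # computes the answer, the other replays a precomputed look-up
    # table, Chinese-Room style.

    def computed(n):
        return n * n            # works the answer out each time

    TABLE = {n: n * n for n in range(10000)}

    def looked_up(n):
        return TABLE[n]         # no arithmetic at reply time

    # Externally indistinguishable across the table's domain, yet the
    # sequences of computational steps inside are quite different.
    assert all(computed(n) == looked_up(n) for n in range(10000))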

-- 
Stathis Papaioannou