That's getting reasonably close, assuming you don't require the model to have 
any specific degree of fidelity -- there's a difference between being 
conscious of something and understanding it. 

The key is that we judge the consciousness of an entity by whether its 
processes and data structures can duplicate the abilities and reactions we 
see in ourselves and others as part of being conscious. 

I have a flashbulb memory of a flooded basement when I was 3. It includes the 
geometric arrangement of the stairs, the back door of the house, a point of 
view at the top of the stairs, and the fact that there was deep water in the 
basement. That's it -- no idea what color the walls were, whether anyone said 
anything, etc. And no other memories at all before age 4. I'd have to claim 
I was conscious then, and presumably much of the rest of the time at that 
age, because I was obviously parsing the world into a coherent account and 
would have been capable of forming short-term memories in that "language".

If you ask the average person questions -- especially "why did you do that?" 
kind of questions -- it's amazing how much of what they say is confabulation 
and rationalization. To me that's evidence that they're *not* as conscious as 
they think they are, and that their self-models, which they consult to 
answer such questions, are only loosely coupled to their actual mind 
mechanisms.
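
To make that concrete, here's a toy sketch (my own names and structure, 
purely illustrative, in Python) of an agent whose actions come from one 
mechanism while its explanations come from a separate, simplified self-model 
-- so its answer to "why did you do that?" is a rationalization rather than 
the real cause:

    import random

    class Agent:
        def act(self, options):
            # Actual mechanism: an opaque, partly arbitrary preference.
            self._last_choice = max(
                options, key=lambda o: hash(o) % 7 + random.random())
            return self._last_choice

        def explain(self):
            # Self-model: a tidy story only loosely coupled to act().
            return ("I chose %s because it seemed the most sensible."
                    % self._last_choice)

    agent = Agent()
    agent.act(["tea", "coffee", "water"])
    print(agent.explain())  # a plausible rationalization, not the real cause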

That in turn leads me to believe that we can see the limits of the illusion 
consciousness gives us, and thus look under the hood, much as we can learn 
more about the visual process by studying optical illusions. 

Josh

On Monday 02 June 2008 01:55:32 am, Jiri Jelinek wrote:
> > On Sun, Jun 1, 2008 at 6:28 PM, J Storrs Hall, PhD <[EMAIL PROTECTED]>
> > wrote:
> > Why do I believe anyone besides me is conscious? Because they are made of
> > meat? No, it's because they claim to be conscious, and answer questions
> > about their consciousness the same way I would, given my own conscious
> > experience -- and they have the same capabilities
> 
> Would you agree that they are conscious of X when they demonstrate the
> ability to build mental models that include an abstract X concept that
> (at least to some degree) corresponds (and is intended to correspond)
> to the "real world" representation/capabilities of X?
> In the case of self-consciousness, the X would simply = self.
> 
> Regards,
> Jiri Jelinek
> 

