> From: Ben Goertzel [mailto:[EMAIL PROTECTED]
> 
> If by "conscious" you mean "having a humanlike subjective experience",
> I suppose that in future we will infer this about intelligent agents
> via a combination of observation of their behavior, and inspection of
> their internal construction and dynamics.
> 

An "N"-like subjective experience, where N could be human, animal, bug, space
alien, or god. I'm not sure there even needs to be an "I", since you could
have a distributed, decentralized "I" or other forms.

> But in future, there could be impostor agents that act like they have
> humanlike subjective experience but don't ... and we could uncover
> them by analyzing their internals...
> 

If the impostors are good enough, then they would be the same from a
functional perspective. And if they were functionally the same, they would
eventually improve their various attributes until we appeared as zombies to
them and they as godlike to us, even though they were just imitating our
consciousness.

> This is under the assumption that subjective experience of an agent is
> correlated with (though not identical with) the patterns in the
> physical system serving as the substrate of that agent ... and that
> external behaviors only constitute a subset of these patterns...
> 

Yes, unless there is that complexity-layer disconnect...


John




-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/