> -----Original Message-----
> From: Matt Mahoney via AGI [mailto:[email protected]]
> Subject: Re: [agi] Review paper on measuring consciousness
> 
> On Tue, Sep 23, 2014 at 1:18 AM, Ben Goertzel <[email protected]> wrote:
> >
> > I don't think these issues (regarding qualia and the nature of
> > subjective
> > experience) matter for AGI design, but I don't think they're
> > meaningless either
> 
> I agree. An AGI needs to be able to model human minds in order to
> communicate effectively with people. If the model didn't claim to be
> conscious then I would consider that a bug.
> 

They don't matter UNLESS engineering a real p-conscious entity, versus an ersatz 
Google-like behavioral-regurgitation system, turns out to be easier to build, 
requires significantly fewer computational resources to run, AND yields 
significantly more intelligence and human-interactive assistance capability on 
those same resources.

IOW: which one is easier to build, and which one runs better? My argument for a 
p-conscious AGI design is likewise based on engineering and runtime estimates.

John


