> From: Jiri Jelinek [mailto:[EMAIL PROTECTED]]
> On Wed, Nov 12, 2008 at 2:41 AM, John G. Rose <[EMAIL PROTECTED]> wrote:
> > Is it really necessary for an AGI to be conscious?
>
> Depends on how you define it. If you think it's about feelings/qualia,
> then no - you don't need that [potentially dangerous] crap, and we
> don't know how to implement it anyway.
> If you view it as a high-level built-in response mechanism (which is
> supported by feelings in our brain but can/should be done differently
> in AGI), then yes - you practically (but not necessarily theoretically)
> need something like that for performance. If you are concerned about
> self-awareness/consciousness, then note that an AGI can demonstrate
> general problem solving without knowing anything about itself (or
> about many other particular concepts). The AGI just needs to be able to
> learn new concepts (including self), though I think some built-in
> support makes sense in this particular case. BTW, for the purpose of my
> AGI R&D I defined self-awareness as the use of an internal
> representation (IR) of self, where the IR is linked to real features
> of the system. Nothing terribly complicated or mysterious about that.
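That definition - an internal representation of self kept linked to real, measurable features of the running system - can be sketched in a few lines of code. The sketch below is a minimal illustration of the idea, not anything from Jiri's actual R&D; the `SelfModel` class and its `link`/`refresh`/`report` methods are hypothetical names invented here, and the "real features" probed are just ordinary process properties standing in for whatever features a real AGI would track.

```python
import os
import time

class SelfModel:
    """Hypothetical internal representation (IR) of the system itself.

    Self-awareness, under the definition above, is just the *use* of an
    IR of self whose entries are linked to real features of the system
    and kept in sync with them.
    """

    def __init__(self):
        self.ir = {}     # the IR itself: feature name -> last known value
        self.links = {}  # feature name -> callable that measures the real thing

    def link(self, name, probe):
        """Bind an IR entry to a probe that reads a real system feature."""
        self.links[name] = probe

    def refresh(self):
        """Synchronize the IR with the real features it is linked to."""
        for name, probe in self.links.items():
            self.ir[name] = probe()

    def report(self, name):
        """Answer a question about 'self' by consulting the IR."""
        return self.ir.get(name)

# Usage: link IR entries to real process features, then query them.
model = SelfModel()
model.link("pid", os.getpid)               # real feature: our process id
model.link("clock", time.time)             # stand-in for an internal clock
model.refresh()
print(model.report("pid"))
```

The point of the sketch is the linkage: `report` never inspects the system directly, it only consults the IR, and the IR is only meaningful because `refresh` ties it back to reality - which is all the definition requires.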
Yes, I agree that problem solving can be performed without self-awareness, and I believe that actions involving rich intelligence need not require consciousness. But yes, it all depends on how you define consciousness. It can be argued that a rock is conscious.

> > Doesn't that complicate things?
>
> It does.
>
> > Shouldn't the machines/computers be slaves to man?
>
> They should, and it shouldn't be viewed negatively. It's nothing more
> than a smart tool. Changing that would be a big mistake IMO.

Yup - when you need to scuttle the spaceship and HAL is having issues with that, it would be better for HAL to understand that he is expendable. Though there are AGI applications that would involve humans building close interpersonal relationships with them, for various reasons - having that AGI psychotherapist could be useful :) And for advanced post-Singularity AGI applications, yes, I suppose machine consciousness, consciousness uploading, and mixing come into it. In the meantime, though, for pre-Singularity design and study I don't see machine consciousness - human-equivalent, that is - as required. Though I do have a fuzzy view of how I would design a consciousness.

> > Or will they be equal/superior?
>
> Rocks are superior to us in being hard. Cars are "superior" to us when
> it comes to running fast. AGIs will be superior to us when it comes to
> problem solving.
> So what? Equal/superior in whatever - who cares, as long as we can
> progress & safely enjoy life - which is what our tools (including AGI)
> are being designed to help us with.

Superior meaning: if, due to limited resources, it came down to me or AGI-X, does AGI-X get to live while I am expendable? Unfortunately there are many computer systems now - domain-specific intelligent ones - whose "life" is treated as more important than mine. Some would say that the battle is already lost.
John

-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: https://www.listbox.com/member/?member_id=8660244&id_secret=120640061-aded06
Powered by Listbox: http://www.listbox.com
