If you had an AGI with a somewhat human-like cognitive architecture, it would be interesting to take a human brain and "re-route" parts of its processing away from selected brain components to the corresponding components of the AGI.
For instance, what if you rewired someone's brain to bypass their hippocampus and route instead to some AGI's "hippocampus analogue"? What would it feel like? By gathering a bunch of subjective reports of this nature, and correlating them with the particulars of the rewiring undertaken in each instance, one could assemble a corpus of data that would let one move toward formulating a theory spanning the subjective and objective aspects of consciousness...

In this case, the way to convince a skeptic (someone saying "the AGI's not really conscious" or whatever) would be not just to show them data or verbal reports, but to invite them to temporarily connect their own brain to the apparatus. Then they would feel what it's like to be part-AGI for themselves. If it really did feel like something -- if you could subjectively sense the AGI-aspect inside "your mind" -- that would convince 99% of skeptics...

Anyway, I think this kind of direction is how the "hard problem of consciousness", as well as the issues regarding machine consciousness, will eventually be "solved". This approach won't answer every question we're currently interested in -- just as modern physics doesn't tell us how many angels can dance on the head of a pin (as medieval philosophers wondered) -- but it will guide us toward the right questions to ask, and enable us to build a new sort of understanding that obsoletes our current concepts of objectivity and subjectivity...

-- Ben G

On Mon, Sep 22, 2014 at 11:17 AM, Matt Mahoney via AGI <[email protected]> wrote:
> On Sun, Sep 21, 2014 at 7:56 PM, John Rose via AGI <[email protected]> wrote:
> >> -----Original Message-----
> >> From: Matt Mahoney [mailto:[email protected]]
> >>
> >> Are you saying that a p-zombie AGI would behave differently than a
> >> conscious AGI? Now we're getting somewhere. Exactly how would you test
> >> if an AGI really "understands" us?
> >
> > On second thought I don't know which is more dangerous, a
> > non-p-conscious AGI or a p-conscious AGI.
>
> I would suspect a p-conscious AGI to be more dangerous if I believed
> in p-consciousness. That is because I would also assume it had free
> will. But free will is also an illusion. It is caused by positive
> reinforcement of actions. Again, we have it because it increases our
> reproductive fitness. You would not want to live if you did not enjoy
> doing things. The side effect of the pleasure of doing things is that
> it reinforces the belief that there is a "me" that decides to do them,
> rather than deterministic neural processes sending signals to my
> muscles.
>
> We associate p-consciousness with understanding. If I want to test
> whether an AGI understands me, I would test it like I would test a
> human: I would ask it to rephrase what I just said. The problem is
> that we already have machines that can do this. When you search on
> Google, it will match phrases that have the same meaning but different
> words.
>
> We don't think Google is conscious. To me that is irrelevant, as long as
> it understands me.
>
> --
> -- Matt Mahoney, [email protected]
>
> -------------------------------------------
> AGI
> Archives: https://www.listbox.com/member/archive/303/=now
> RSS Feed: https://www.listbox.com/member/archive/rss/303/212726-deec6279
> Modify Your Subscription: https://www.listbox.com/member/?&
> Powered by Listbox: http://www.listbox.com

--
Ben Goertzel, PhD
http://goertzel.org

"In an insane world, the sane man must appear to be insane." -- Capt. James T. Kirk

"Emancipate yourself from mental slavery / None but ourselves can free our minds" -- Robert Nesta Marley
