On Mon, Sep 22, 2014 at 10:26 AM, John Rose via AGI <[email protected]> wrote:
> If there were 2 AGIs, one p-conscious and one that merely had a belief it 
> was p-conscious, would you feel the same when unplugging each?

Let's not confuse the 3 meanings of the word "consciousness".
1. A mental state of wakefulness.
2. A soul or homunculus: an identity felt to be separate from the brain
but with identical behavior. Belief in it is reinforced by the sense of
qualia, the pleasant feeling we associate with storing perceptions and
thoughts into episodic memory.
3. A property of agents that makes it immoral to harm them.

How do I feel about aborting a fetus? How do I feel about eating meat?
I am not aware of any objective test for right and wrong. Lots of
people have proposed rules based on their opinions, which can be quite
strong. This sometimes leads to war.

> Could you input some original poetry into Google and ask how it feels about 
> it?

With sufficient training data, you could write a program that would
input poetry and predict how people would feel about it. Then with a
small modification, the computer could claim to feel the same way.
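A minimal sketch of that idea, with the poems, labels, and scoring all
invented for illustration: learn word-feeling associations from labeled
examples, predict a feeling for new text, then make the "small
modification" of reporting the prediction in the first person.

```python
# Toy sketch: learn which words co-occur with which feeling labels,
# then predict a feeling for unseen text. All data here is made up.
from collections import Counter

def train(examples):
    """examples: list of (text, label) pairs. Returns per-label word counts."""
    counts = {}
    for text, label in examples:
        counts.setdefault(label, Counter()).update(text.lower().split())
    return counts

def predict(counts, text):
    """Score each label by how often its training words appear in the text."""
    words = text.lower().split()
    return max(counts, key=lambda label: sum(counts[label][w] for w in words))

examples = [
    ("the sun warms the golden meadow", "pleasant"),
    ("soft rain and gentle morning light", "pleasant"),
    ("cold ashes in an empty grave", "bleak"),
    ("the dark night devours all hope", "bleak"),
]

model = train(examples)
label = predict(model, "gentle light on the meadow")
# The "small modification": the program claims the feeling as its own.
print("This poem makes me feel something", label)
```

A real system would need far more training data and a far better model,
but the last line is the whole philosophical move: prediction rephrased
as a first-person report.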


-- Matt Mahoney, [email protected]


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now