Mark Waser wrote:
AGIs (at least those that could run on current computers)
cannot really get excited about anything. It's like when you represent
the pain intensity with a number. No matter how high the number goes,
it doesn't really hurt. Real feelings - that's the key difference
between us and them, and the reason why they cannot figure out on their
own that they would rather do something other than what they were asked
to do.

So what's the difference in your hardware that makes you have real pain and real feelings? Are you *absolutely positive* that "real pain and real feelings" aren't an emergent phenomenon of sufficiently complicated and complex feedback loops? Are you *really sure* that a sufficiently sophisticated AGI won't experience pain?

I think that I can guarantee (as in, I'd be willing to bet a pretty large sum of money) that a sufficiently sophisticated AGI will act as if it experiences pain... and if it acts that way, maybe we should just assume that it is true.
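To make the "feedback loops" point concrete, here is a minimal sketch in Python. It is entirely my own toy illustration, not anything proposed in the thread: the `ToyAgent` class and all its names are invented. "Pain" is just a scalar, as Jiri says, yet once that scalar feeds back into action selection, avoidance behavior emerges, which is exactly the "acts as if it experiences pain" that Mark is betting on.

```python
# Toy sketch (invented for illustration): an agent whose "pain" is only a
# number, but which feeds a feedback loop that reshapes future behavior.

class ToyAgent:
    def __init__(self):
        self.pain = 0.0
        self.avoid = {}          # action -> learned aversion

    def act(self, actions):
        # prefer the action with the least learned aversion
        return min(actions, key=lambda a: self.avoid.get(a, 0.0))

    def feel(self, action, damage):
        # "pain" is just a scalar, yet it changes all future choices
        self.pain = damage
        self.avoid[action] = self.avoid.get(action, 0.0) + damage

agent = ToyAgent()
for _ in range(5):
    choice = agent.act(["touch_stove", "touch_table"])
    agent.feel(choice, 1.0 if choice == "touch_stove" else 0.0)

print(agent.act(["touch_stove", "touch_table"]))  # prints "touch_table"
```

Of course, nothing here settles whether the agent *really* hurts; the point is only that, from the outside, a number plus a feedback loop already yields pain-shaped behavior.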

Jiri,

I agree with Mark's comments here, but would add that I think we can do more than just take a hands-off Turing attitude to such things as pain: I believe that we can understand why a system built in the right kind of way *must* experience feelings of exactly the sort we experience.

I won't give the whole argument here (I presented it at the Consciousness conference in Tucson last year, but have not yet had time to write it up as a full paper).

I think it is a serious mistake for anyone to say that machines cannot in principle experience real feelings. Sure, if they are too simple they will not, but our discussions on this list are not about those kinds of too-simple systems.

Having said that: there are some conventional approaches to AI that are so crippled that I don't think they will ever become AGI, let alone have feelings. If you were criticizing those specifically, rather than AGI in general, I'm on your side! ;-)


Richard Loosemore

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
