Talk about fortuitous timing . . . . here's a link on Marvin Minsky's latest, about emotions and rational thought:
http://www.boston.com/news/globe/health_science/articles/2006/12/04/minsky_talks_about_life_love_in_the_age_of_artificial_intelligence/

The most relevant line to our conversation is: "Called 'The Emotion Machine,' it argues that, contrary to popular conception, emotions aren't distinct from rational thought; rather, they are simply another way of thinking, one that computers could perform."

----- Original Message -----
From: "Mark Waser" <[EMAIL PROTECTED]>
To: <[email protected]>
Sent: Tuesday, December 05, 2006 10:05 AM
Subject: Re: [agi] A question on the symbol-system hypothesis

>> Are you saying that the more excuses we can think up, the more
>> intelligent we are? (Actually there might be something in that!).
>
> Sure. Absolutely. I'm perfectly willing to contend that it takes
> intelligence to come up with excuses and that more intelligent people
> can come up with more and better excuses. Do you really want to
> contend the opposite?
>
>> You seem to have a real difficulty in admitting that humans behave
>> irrationally a lot (most?) of the time.
>
> You're reading something into my statements that I certainly don't
> mean to be there. Humans behave irrationally a lot of the time. I
> consider this fact a defect or shortcoming in their intelligence (or
> make-up). Just because humans have a shortcoming doesn't mean that
> another intelligence will necessarily have the same shortcoming.
>
>> Every time someone (subconsciously) decides to do something, their
>> brain presents a list of reasons to go ahead. The reasons against are
>> ignored, or weighted down to be less preferred. This applies to
>> everything from deciding to get a new job to deciding to sleep with
>> your best friend's wife. Sometimes a case arises when you really,
>> really want to do something that you *know* is going to end in
>> disaster, ruined lives, ruined career, etc., and it is impossible to
>> think of good reasons to proceed.
>> But you still go ahead anyway, saying that maybe it won't be so bad,
>> maybe nobody will find out, it's not all my fault anyway, and so on...
>
> Yup. Humans are not as intelligent as they could be. Generally, they
> place way too much weight on near-term effects and not enough weight
> on long-term effects. Actually, though, I'm not sure whether you'd
> classify that as intelligence or wisdom. Many bright people *do* know
> all of what you're saying and still go ahead. This is certainly some
> form of defect; I'm just not sure where you'd classify it.
>
>> Human decisions and activities are mostly emotional and irrational.
>
> I think that this depends upon the person. For the majority of humans,
> maybe -- but I'm not willing to accept that this applies to each
> individual human, that their decisions and activities are mostly
> emotional and irrational. I believe that there are some humans for
> whom this is not the case.
>
>> That's the way life is. Because life is uncertain and unpredictable,
>> human decisions are based on best guesses, gambles, and basic
>> subconscious desires.
>
> Yup, we've evolved to be at least minimally functional, though not
> optimal.
>
>> An AGI will have to cope with this mess.
>
> Yes, so far I'm in total agreement with everything you've said . . . .
>
>> Basing an AGI on iron logic and 'rationality' alone will lead to what
>> we call 'inhuman' ruthlessness.
>
> . . . until now, where you make an unsupported blanket statement that
> doesn't appear to me to be related to any of the above (and which may
> be entirely accurate or inaccurate depending upon what you mean by
> ruthless -- but I believe that it would take a very contorted
> definition of ruthless to make it accurate, though inhuman should
> obviously be accurate).
>
> Part of the problem is that 'rationality' is a very emotion-laden term
> with a very slippery meaning.
> Is doing something because you really, really want to, despite the
> fact that it most probably will have bad consequences, really
> irrational? It's not a wise choice, but irrational is a very strong
> term . . . . (And, as I pointed out previously, such a decision *is*
> rationally made if you have bad weighting in your algorithm -- which
> is effectively what humans have -- or not, since that weighting has
> apparently been selected for by evolution.)
>
> And logic isn't necessarily so iron if the AGI has built-in biases for
> conversation and relationships (both of which are rationally derivable
> from its own self-interest).
>
> I think that you've been watching too much Star Trek, where logic and
> rationality are the opposite of emotion. That just isn't the case.
> Emotion can be (and is most often noticed when it is) contrary to
> logic and rationality -- but it is equally likely to be congruent with
> them (and even more so in well-balanced and happy individuals).
>
> ----- Original Message -----
> From: "BillK" <[EMAIL PROTECTED]>
> To: <[email protected]>
> Sent: Tuesday, December 05, 2006 7:03 AM
> Subject: Re: Re: Re: Re: [agi] A question on the symbol-system hypothesis
>
>> On 12/4/06, Mark Waser wrote:
>>>
>>> Explaining our actions is the reflective part of our minds
>>> evaluating the reflexive part of our minds. The reflexive part of
>>> our minds, though, operates analogously to a machine running on
>>> compiled code, with the compilation of that code being largely *not*
>>> under the control of our conscious minds (though some degree of it
>>> *can* be changed by our conscious minds). The more we can correctly
>>> interpret and affect/program the reflexive part of our minds with
>>> the reflective part, the more intelligent we are.
>>> And, translating this back to the machine realm circles back to my
>>> initial point: the better the machine can explain its reasoning and
>>> use its explanation to improve its future actions, the more
>>> intelligent the machine is (or, in reverse, no explanation = no
>>> intelligence).
>>>
>>
>> Your reasoning is getting surreal.
>>
>> As Ben tried to explain to you, 'explaining our actions' is our
>> consciousness dreaming up excuses for what we want to do anyway. Are
>> you saying that the more excuses we can think up, the more
>> intelligent we are? (Actually there might be something in that!).
>>
>> You seem to have a real difficulty in admitting that humans behave
>> irrationally a lot (most?) of the time. Don't you read newspapers?
>> You can redefine rationality if you like to say that all the crazy
>> people are behaving rationally within their limited scope, but what's
>> the point? Just admit their behaviour is not rational.
>>
>> Every time someone (subconsciously) decides to do something, their
>> brain presents a list of reasons to go ahead. The reasons against are
>> ignored, or weighted down to be less preferred. This applies to
>> everything from deciding to get a new job to deciding to sleep with
>> your best friend's wife. Sometimes a case arises when you really,
>> really want to do something that you *know* is going to end in
>> disaster, ruined lives, ruined career, etc., and it is impossible to
>> think of good reasons to proceed. But you still go ahead anyway,
>> saying that maybe it won't be so bad, maybe nobody will find out,
>> it's not all my fault anyway, and so on...
>>
>> Human decisions and activities are mostly emotional and irrational.
>> That's the way life is. Because life is uncertain and unpredictable,
>> human decisions are based on best guesses, gambles, and basic
>> subconscious desires.
>>
>> An AGI will have to cope with this mess.
>> Basing an AGI on iron logic and 'rationality' alone will lead to what
>> we call 'inhuman' ruthlessness.
>>
>>
>> BillK
>>
>> -----
>> This list is sponsored by AGIRI: http://www.agiri.org/email
>> To unsubscribe or change your options, please go to:
>> http://v2.listbox.com/member/?list_id=303
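P.S. The "bad weighting in your algorithm" point above can be sketched concretely: the same expected-utility rule picks the reckless option once near-term payoffs are over-weighted relative to long-term ones. All the payoff numbers, weights, and option names below are invented purely for illustration -- nothing here is data.

```python
def utility(near_payoff, long_payoff, near_weight, long_weight):
    """Weighted sum of a choice's near-term and long-term payoffs."""
    return near_weight * near_payoff + long_weight * long_payoff

# Two hypothetical options as (near-term payoff, long-term payoff):
# the tempting act pays off now but is disastrous later; restraint
# pays nothing now and is mildly good later.
OPTIONS = {
    "go_ahead": (10.0, -100.0),
    "refrain": (0.0, 5.0),
}

def choose(near_weight, long_weight):
    """Return the option with the highest weighted utility."""
    return max(
        OPTIONS,
        key=lambda name: utility(*OPTIONS[name], near_weight, long_weight),
    )

# A balanced weighting avoids the disaster:
#   go_ahead scores 10 - 100 = -90, refrain scores 5.
print(choose(near_weight=1.0, long_weight=1.0))   # refrain

# A steeply discounted, human-like weighting "rationally" selects the
# reckless option given its own (bad) weights:
#   go_ahead scores 10 - 5 = 5, refrain scores 0.25.
print(choose(near_weight=1.0, long_weight=0.05))  # go_ahead
```

Both choices maximize the same utility function; only the weights differ -- which is exactly the sense in which a short-sighted decision is "rationally made" under bad weighting.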
