I have to admit, I do not understand what is going on here, either.  Doug's comment on the words used caught my attention, though.  While in most instances rhetoric is thought to flow FROM culture, there is a growing body of work that holds the opposite:  we create our culture through our use of words and narrative.  See:
  • Tribal Leadership--http://www.triballeadership.net/  Logan and co-authors explore organizational culture and productivity.  Anyone familiar with their work will recognize a classic Level 3 in action here.  Nick's "we" and "they" observation is further amplification of the power of words.  Logan's thesis is that one has to change the rhetoric and the relationships to change the ideas, not the other way around.
  • Cynefin Framework--http://www.cognitive-edge.com/ Dave Snowden makes a more direct connection with complexity theory through his description of cause/effect and of order vs. "unorder".  He also describes narrative as a potent force for organizational action.

So what am I driving at?  Isn't emergence part of changing the "I" into "We" and beyond into "All of Us"?  Logan's work would indicate that the "emergence" formed in moving a culture from a Level 3 ("I'm great, and by the way, you aren't") through Level 4 ("We're great, and they are not") and to Level 5 ("Life is great") results in a non-linear jump in productivity.

Russ G.



Russell S. Gonnering, MD, FACS, MMM, CPHQ




On Jun 19, 2009, at 10:14 AM, Douglas Roberts wrote:

I've watched this particular verbal volleyball match for over a week now, and I must confess:  I don't have the faintest idea what the objective of the exercise is.  What I have noticed, however, is repeated usage of words that apparently have deep, overloaded, special meanings to their author, but not to the audience.

"Experience" "conscious" "suffers from", for example.

Could someone please tell me what the fuss is all about?  Succinctly?  Why are you all apparently agonizing over whether a robot can "feel" "nauseous"?

TIA (which stands for Thanks, In Advance)

--Doug

--
Doug Roberts
[email protected]
[email protected]
505-455-7333 - Office
505-670-8195 - Cell

On Fri, Jun 19, 2009 at 8:57 AM, Russ Abbott <[email protected]> wrote:
As I wrote to Nick directly, I think Nick is gracious and kind and a man of great integrity.

But this doesn't make sense to me: "We don't have to believe in inner minds to say that a person accused of dishonesty behaves as if deeply hurt." What could it possibly mean to say that a person is deeply hurt if there is no such thing as first person experience?  And if there is no such thing as being deeply hurt in a first person way, what could it possibly mean to say that someone is behaving as if deeply hurt?

This suggests that it is very dangerous to claim that there is no first person experience and that observable behavior is all there is. It would encourage "treating people as objects" because that's exactly the position it takes. An attitude of this sort would seem to discard millennia of progress in our understanding and acceptance of what ethical human-to-human interaction consists of.

-- Russ


On Fri, Jun 19, 2009 at 7:40 AM, John Kennison <[email protected]> wrote:


Nick and I are on opposite sides of the consciousness debate. I think there is an inner mind and that I experience it. Nick rejects statements not made from the third person perspective. Perhaps the debate suffers from a feeling that if we take Nick's third person view, we are not allowed to use metaphorical statements that suggest an inner mind. But clearly we can say "The computer had an illusion" or a "breakdown" etc. to describe behavior. (e.g. The behavior was as we imagined it would be if the computer had an inner mind which suffered a breakdown.) Moreover, not only can these metaphorical statements about behavior be defined rigorously, but we can formulate and test rules about how they are related. We don't have to believe in inner minds to say that a person accused of dishonesty behaves as if deeply hurt. That is why we should not casually make such accusations nor assume they will be without negative consequences even if there is no inner mind.



________________________________________
From: [email protected] [[email protected]] On Behalf Of Russ Abbott [[email protected]]
Sent: Thursday, June 18, 2009 11:07 PM
To: [email protected]
Cc: [email protected]; [email protected]
Subject: Re: [FRIAM] Nick and dishonest behavior

Nick wrote:

To call a man "dishonest" (my word, I admit, but you have embraced it) is very harsh in my world, and seems (to me) to require a level of certainty about another person's motives that I just don't know how you could come by from your limited experience with me.  ...

You are insisting on the correctness of your view of my mind based on inferences from my behavior.

Yes, I'm doing exactly that, judging you on the basis of your behavior -- in this conversation. (The past 40 years aren't relevant to that.) Your position in this discussion seems to be that your behavior is all there is. So why are you objecting that I'm doing it?

Furthermore, your objection seems to be that I don't know what your "motives" are.  I'm not sure what you mean by motives in this case. I'm not assuming any particular motive. In fact I'm confused about what your motives might be and why you are acting so dishonestly. Yet you are acting dishonestly.

To review: a good example of your dishonest behavior was your answer to my question about nausea. You provided a very nice first person description of what it means to feel nauseous.

If you say that you are "feeling nauseous" I will understand that your world seems like it is churning around but that your visual cues do not confirm it (i.e., you are dizzy) and that your stomach feels the way it does when on previous occasions you have thrown up.

Note your use of the first person words seems and feels. But then you refused to answer whether that description would ever apply to a robot. Instead you offered a 3rd person description of what it looks like to feel nauseous and said that of course a robot could fit that description. I call that dishonest.  You know what a first person description means because you used it yourself. But then you refused to answer the question of whether such a first person description could apply to a robot. Furthermore, you refused to acknowledge that this is what you were doing. I see that as dishonest. But I don't know what your motives for acting this way might be.

Besides, why are you so concerned about my characterizing your behavior as dishonest? Why is that a very harsh term? It's simply a description of your behavior.

Are you upset because you are taking my use of the term dishonest to apply more broadly than to your behavior? In the second passage of yours quoted above, you talked about my view of your mind. Are you unhappy that I seem to be implying that your mind is dishonest? I thought your position was that there is no mind for me to have a view of. I thought your position was that behavior was all that mattered. It should not matter to you what "my view of your mind" is if it doesn't mean anything to talk about minds.


-- Russ


============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org


