Observing how the present diverges from the past should be useful, both for
learning to control or capitalize on how nature works, and for controlling
ourselves so we stop repeating past choices that turned out to be mistakes.

I'm trying to share something of my experience and verifiable knowledge of
that, something I think is of real importance.   Some see only a fine line
between learning someone's tricks for making your own discoveries and
repeating back the words they use to describe theirs, but of course there's
a world of difference.   I don't want to hear my empty words back; I want
to hear your full words, reflecting your having made some of the same
observations.  Words are only meaningful if they represent shared
experience.  I think science can help us compare notes on our independent
observations of the divergent processes in nature, and really learn
something from that.

Growing rates and kinds of learning occur within relationship networks as
they multiply their organizational scale and complexity.  That applies to
projects that start small at home or work, to software, building plans,
businesses, industries, societies, etc., that get endlessly bigger in scale
and incorporate changes in kind ever faster.  I observe that when a complex
multiplication of relationships like that runs into an unexpected rush of
complications, it's often just before serious widespread failures occur.
That rush looks to me like a signal that the system as a whole has crossed
a line toward unmanageability, an internal 'breaking point'.
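
To make that concrete, here is a minimal toy model, a sketch with made-up
numbers rather than anything measured: coordination demand multiplies a
little faster than the parts can speed up their responses, and the backlog
of unresolved complications stays small for a long stretch, then piles up
abruptly, the 'rush of complications' just before the breaking point.

```python
# Toy model (an illustrative sketch, with assumed rates): demand for
# coordination grows 5% per step, but each part can only improve its
# response capacity by at most 3% per step. The backlog of unresolved
# complications compounds quietly, then swamps the system.

GROWTH = 0.05      # assumed growth rate of coordination demand
MAX_ADAPT = 0.03   # assumed ceiling on how fast the parts can adapt

demand, capacity, backlog = 1.0, 1.0, 0.0
for step in range(120):
    demand *= 1 + GROWTH
    # parts adapt toward current demand, but no faster than MAX_ADAPT
    capacity *= 1 + min(MAX_ADAPT, max(0.0, demand / capacity - 1))
    # whatever the parts can't respond to piles up as complications
    backlog += max(0.0, demand - capacity)
    if step % 20 == 0 or backlog > 10 * capacity:
        print(f"step {step:3d}  demand {demand:7.2f}  "
              f"capacity {capacity:7.2f}  backlog {backlog:7.2f}")
    if backlog > 10 * capacity:
        print("backlog has swamped capacity: the 'breaking point'")
        break
```

Note that the abruptness comes from the compounding gap between the two
growth rates, not from any sudden change in the rules; nothing in the model
breaks until, all at once, nearly everything is behind.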

Do any of you notice that rush of complications as a signal of self-controls
becoming overextended and unresponsive, and of systems about to go "out of
control", like overdriving the slop in your steering system?   It's also a
little like a juggler being thrown just one too many balls to keep in the
air at once, and dropping not just the one but nearly all of them.  I
think it's a general property of divergent learning systems.   Do you guys
recognize any cases where organizational instability arises from exceeding
the learning responses of the parts?

If there were such a property of instability in growth, and if you
considered cybernetics to be the science of control, then a principle of
self-control to avoid pushing learning responses out of control could be
called its "principal principle", i.e., don't overshoot.   That's what I
dubbed it, anyway: the prudent choice not to push the learning demands of a
system beyond the responsiveness of its parts.  Does that make sense in
terms of what you observe?



Phil Henshaw                                  ¸¸¸¸.·´ ¯ `·.¸¸¸¸
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
680 Ft. Washington Ave NY NY 10040                       
tel: 212-795-4844   e-mail: [EMAIL PROTECTED]    explorations:
www.synapse9.com    
"it's not finding what people say interesting, but finding the interest in
what they say"