Rich,

> I'm just now picking up this conversation, so forgive me for coming
> in late.
>
> This all reminds me of a book I'm sure other people here have read,
> Tainter's "The Collapse of Complex Societies." His thesis, basically,
> is that societal complexity, however you measure it (diversity of
> artifacts and technology, education and specialized skill sets,
> social structures), represents an investment of energy that allows
> for more intensive exploitation of resources. As this investment
> increases, diminishing marginal returns are unavoidable and the
> surpluses required to address new problems disappear, making the
> society susceptible to collapse. He also talks about how collapse can
> be the rational decision on the part of a society's members. He gives
> the example of how Romans living on the frontiers of their empire
> would invite in hordes of marauding tribes because they were less of
> a blight on their lives than the Roman tax collector.

I haven't read it yet. Thanks for the suggestion. I tend to think there
are several natural causes for sudden system failures and deaths, and we
don't know much about them. My two favorites are the collapse of the
USSR and the collapse of the NYC crime wave in 1991. They both just
seemed to go whoosh. The system failure I've been talking about is what
happens when stress overwhelms the internal response mechanisms of a
system driven to endlessly grow. The demands on the response network
increase exponentially and it fails; something like the response to
Katrina, but in slow motion, is what I'd expect.
> In this conceptual model, the only thing that can at least
> temporarily increase these marginal returns and hold off a society's
> collapse is the discovery of additional sources of energy.

In fact there are lots of things that can change the course of growth
system events. That's the problem with the older Limits to Growth
studies: they used stocks and flows of measurable commodities, and there
are lots of important immeasurable commodities.

> One thing I felt was interesting was his sometimes detached view on
> whether or not complex societies are good or bad things. He had
> criticized lots of other work on collapse as being too heavily laden
> with value judgments to be scientifically credible, so I guess he was
> being extra cautious about his own.

Well, was it probing storytelling, just good journalism? That counts for
a lot. Malcolm Gladwell does pretty well playing with simple ideas and
telling stories about real-world systems of different kinds.

> But one thing it did impress upon me is that humanity has spent the
> vast majority of its time in very simple, band-level societies. As
> such, we should probably think of complex societies as deviations
> from this norm, an experiment.
>
> From this perspective, I think we, as humans, are pretty robust, but
> the complex societies we're so enamored of are extremely fragile.
> Since we are the very bacteria inhabiting the metaphorical Petri dish
> of this experiment, it's hard to take such a detached point of view.

Sure, in a way I agree humans would be better off planted far enough
apart so they couldn't harm anything. I don't think there's any chance
of our discarding cities, though. This behemoth occupation of the planet
might be another question perhaps, but I think we like cities. What do
you think will be the end of our explosively accelerating economic
expansion: figuring out a way to let it ease off that's healthy and
sustainable, or letting it go to systemic collapse?
Phil

> Rich
>
> BTW,
>
> Since we currently put more energy into agriculture than we get out,
> I wonder if agricultural energy crops and byproducts can really save
> us. Does anyone out there have an opinion?
>
> On Jun 4, 2006, at 7:42 AM, Carlos Gershenson wrote:
>
> >> I think that assumes that cause and effect for any one system is
> >> statistical across all systems. I don't believe that to be the
> >> case. Given a cellular system like an economy, where you can't
> >> really transcend the basic cells, the humans with all our gifts
> >> and failings, there seem likely to be response-time failure
> >> thresholds where ever bigger repercussions get ever slower and
> >> less reliable corrections, and stabilizing the rapidly changing
> >> internal and environmental relationships fails.
> >
> > I think that it is common to think that human society is fragile.
> > Well, the fact that we're still around shows that we aren't. Last
> > week, I learned about two competing "doomsday theories" from LANL
> > people: bird flu, and peak oil. They both assume that small
> > catastrophes trigger chaos. But even if nuclear war breaks out,
> > that wouldn't erase mankind from the face of the earth. It would
> > suck, for sure, and all these scenarios make profitable
> > blockbusters, but we humans are persistent little vermin... In any
> > of these cases society would change, for sure, but precisely that
> > is part of the adaptation. It wouldn't collapse. It hasn't
> > collapsed, and there have been plenty of wars, famines, plagues,
> > and all the other things mentioned in the Apocalypse... and we're
> > still around. So I find it extremely improbable that something
> > would wipe us out. I am not suggesting that mankind will be forever
> > on Earth, but that evolving into something else seems to me more
> > probable than extinction by catastrophe.
> >> Asteroids might be a problem, and failures of imagination might
> >> be of a seemingly equally stubborn nature. I mean, if we've gone
> >> and built an entire civilization, business plan and government
> >> financing structure that relies on continual exponential increases
> >> in the complexity of the system... and that turns out to be really
> >> dangerous, it's quite a major failure of imagination, it seems to
> >> me.
> >
> > If the complexity growth were to fade away, I don't see
> > civilization collapsing, so I don't understand why you say that we
> > rely on increasing complexity, nor why this might be dangerous.
> >
> >> I definitely think we should make government competent by design.
> >> There are lots of do's and don'ts regarding performance measures,
> >> but if departments developed concepts of productivity beyond just
> >> bean-counter efficiency, having internal groups competing would be
> >> highly productive.
> >
> > Indeed, there are many things to be improved. Some people might
> > think that there is no pressure for improving services. That is the
> > case when there is no political choice (like in dictatorships or
> > pseudo-democracies). But if there are competing political forces,
> > they will try to improve government to gain more votes. So, slowly
> > (maybe too slowly), but surely, we're getting there...
> >
> > Best regards,
> >
> > Carlos Gershenson...
> > Centrum Leo Apostel, Vrije Universiteit Brussel
> > Krijgskundestraat 33. B-1160 Brussels, Belgium
> > http://homepages.vub.ac.be/~cgershen/
> >
> > "There is no game in which you cannot cheat"
> >
> > ============================================================
> > FRIAM Applied Complexity Group listserv
> > Meets Fridays 9a-11:30 at cafe at St. John's College
> > lectures, archives, unsubscribe, maps at http://www.friam.org
