On 04/10/2013 09:17 AM, Nicholas Thompson wrote:
> I have yet to integrate my thinking about "convergence" (preferable to
> "consensus", I think) with the stuff about recursion, which was near-30
> years ago. It was the sort of thing that I thought Peter Lipton and I might
> do when we were old. Not sure I am man enough to do it alone. I think
> Peirce would say ... particularly the later Peirce ... that in recursive
> explanations lurks a form of "right-thinking" that cannot be described in
> the terms of formal logic.
I actually distrust consensus and convergence, equally, I think. This is for the same reason I think the "singularity" concept is suspicious. It implies a closedness that I don't believe in. The universe seems open to me, which implies that any process (including explanation) _wanders_ significantly. I will admit constraints, though. Although any process may wander, it may do so within some hard boundaries ... like a sandwiched series that forever oscillates without actually converging.

Anyway, re your paper: The concept of filter explanations may end up being quite useful to me, for the same reason that abduction is useful to me. For most of my career, I've tried to explain to my fellow simulants that any particular snapshot of a modeling effort is not very useful. I.e. any particular _model_ is not very useful (with an anti-authoritarian prejudice against the much-abused "all models are wrong, some are useful" aphorism -- I actually think that aphorism has done more damage to the proper use of simulation than any other concept). But the whole modeling and simulation (M&S) effort (a trajectory, or a bundle of trajectories, given model forking) _is_ useful.

The distinction I would draw is that I don't think of these efforts, or the filter explanations you describe in the paper, as recursive so much as _iterative_. Recursion, to me, implies a kind of "normalized" data, just like your "distinguishes X's that are Y from X's that are not-Y". Iteration doesn't usually take advantage of its more general nature. But it's still there. You can perform the same process regardless of the type of the thing to which it's applied. Recursion implies that the result of applying the process will produce something that can itself be processed by that same process. In other words, iteration is "doing it again" and recursion is "doing it to the result of the last time you did it", making recursion the more specific of the two. Hence, recursion targets a more closed type chain.
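To make the distinction concrete, here's a toy sketch of my own (a hypothetical illustration in Python, not anything from your paper): iteration applies the same process to a collection of independent inputs, while recursion feeds each result back in as the next input, which is exactly why recursion demands a closed type chain (the output type must match the input type).

```python
def iterate(process, items):
    # Iteration: "doing it again" -- apply the same process to each
    # thing independently. The inputs need not be related to each
    # other, and the results never have to be valid inputs themselves.
    return [process(x) for x in items]

def recurse(process, seed, depth):
    # Recursion: "doing it to the result of the last time you did it"
    # -- feed each result back in as the next input. This only works
    # when the process's output type matches its input type, i.e. the
    # type chain is closed.
    if depth == 0:
        return seed
    return recurse(process, process(seed), depth - 1)

double = lambda x: 2 * x
print(iterate(double, [1, 2, 3]))  # -> [2, 4, 6]: three independent applications
print(recurse(double, 1, 3))       # -> 8: double(double(double(1)))
```

In a multi-formalism setting the `recurse` version breaks down as soon as one step emits a model that the next step can't consume as-is, while the `iterate` version survives, since each pass only has to be a process, not a type-preserving one.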
This is important to me because my work is multi-formalism: the model produced in one iteration can be wildly different from the model produced in prior or subsequent iterations, different in generating structure and dynamics as well as in phenomenal attributes. Hence I like the concept of filter explanations better than that of recursive explanations, where the filter can co-evolve with the stuff being filtered.

> By the way, there is a truly excellent summary of Peirce's thought, called
> On Peirce ... just a hundred pages ... and expensive for all of that ...
> just a pamphlet, really ... but worth every penny, by Cornelis DeWaal
> (Wadsworth). My Peirce mentor also approves of it.

Thanks. I've added it to my Powell's wishlist.

--
glen e. p. ropella, http://tempusdictum.com, 971-255-2847

============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com
