On Fri, Jul 29, 2011 at 7:48 AM, John Zabroski <[email protected]> wrote:
> On Fri, Jul 29, 2011 at 9:03 AM, Wesley Smith <[email protected]> wrote:
>>
>> > I like to think about simplicity as coming up with the right core
>> > abstractions and the optimal way to distribute complexity among them to
>> > support a large set of use cases.
>>
>> This phrase comes up so much when talking about computational systems
>> that I wonder if it can be made more tangible. It would be really
>> interesting to see different sets of abstractions and some
>> representation of the computational space that they cover.
>
> We had a discussion on the FONC mailing list, around March 2010?, that
> touched upon different ways of viewing complexity in a system. One person
> gave an example using two pictures from a famous systems theory book,
> another argued for Gell-Mann's effective complexity metric, and so on. (I
> think you may have replied to this discussion.) There are many examples in
> computer science where two different logics, algebras, or calculi are
> required to have a complete definition of a system's properties. Axel
> Jantsch has some interesting examples in a book of his I own.

I guess I'm talking less about complexity and all of the different ways to
interpret that word, and instead am looking for a more specific, if not
mathematical, way to talk about what computational primitives provide in
combination and what kind of territory they cover before breaking down.
This was just a thought in the moment and may not make too much sense
though.

wes
_______________________________________________
fonc mailing list
[email protected]
http://vpri.org/mailman/listinfo/fonc
