> My first really strong experiences with programming came from the 
> data-structures world in the late 80s at the University of Waterloo.

Glad to know that the approach I was taught at Waterloo two decades later 
hadn't changed that much :)

Seriously though, I think the lack of better ways to manage artificial 
complexity indicates our field is missing a "true architecture". Rome 
(aqueducts, arches and great buildings) was not built just because the Romans 
had more bricks and slaves than the Egyptians.

> There was an implicit view that one could decompose all problems into 
> data-structures (and a few algorithms and a little bit of glue).

I actually agree completely. Rich Hickey is clearly in this camp with his 
approach to Clojure, as exemplified in his talk "Simple Made Easy". 
http://www.infoq.com/presentations/Simple-Made-Easy

I think the problem is that OO has been pitched as an alternative to 
structured programming rather than a complement. For example, just because 
you can make a function call (C-style) doesn't mean that what happens inside 
the function isn't sequential machine instructions - it's just a way of 
abstracting. Similarly, objects should NOT be about the data + code point of 
view that Java and C++ have pushed, but about the boundary they create (like 
lipid membranes for cells) - the boundary and the messages are what's 
important. What happens "inside" might very well be a data-driven / 
decomposition approach.


> But for many of the things that I've built in the back-end I find that OO 
> causes me to jump through what I think are artificial hoops.

Totally agreed. When I'm "inside" the "back-end" I really want to look at 
data structures and operate on them. Inevitably, to let "other things" 
interface with and use the back-end, I come up with a one-off facade pattern:

- a collection of namespaced C functions that take an opaque struct (see the 
  sketch after this list)
- the syscall interface to an OS
- countless protocols to remote servers (RPC, HTTP, SOAP, ...)
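
To make the first of those concrete, here's a minimal sketch of the 
opaque-struct facade in C (the mylib_stack names are made up for 
illustration): the header exposes only a pointer type and a few prefixed 
functions, and the plain data-structure work happens behind that boundary.

  /* stack.h - callers see only an opaque pointer plus namespaced functions */
  typedef struct mylib_stack mylib_stack;       /* fields not visible here */

  mylib_stack *mylib_stack_new(void);
  void         mylib_stack_free(mylib_stack *s);
  int          mylib_stack_push(mylib_stack *s, int value); /* 0 on success */
  int          mylib_stack_pop(mylib_stack *s, int *out);   /* 0 on success */

  /* stack.c - inside the boundary it's just data structures and loops */
  #include <stdlib.h>

  struct mylib_stack { int *items; size_t len, cap; };

  mylib_stack *mylib_stack_new(void) { return calloc(1, sizeof(mylib_stack)); }

  void mylib_stack_free(mylib_stack *s) { if (s) { free(s->items); free(s); } }

  int mylib_stack_push(mylib_stack *s, int value)
  {
      if (s->len == s->cap) {                 /* grow the backing array */
          size_t cap = s->cap ? s->cap * 2 : 8;
          int *items = realloc(s->items, cap * sizeof *items);
          if (!items) return -1;
          s->items = items;
          s->cap = cap;
      }
      s->items[s->len++] = value;
      return 0;
  }

  int mylib_stack_pop(mylib_stack *s, int *out)
  {
      if (s->len == 0) return -1;
      *out = s->items[--s->len];
      return 0;
  }

From the outside it behaves like an "object" (a membrane you can only poke 
through its interface); on the inside it's the data-structure view again.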

I find "Delimited continuations in operating systems" Oleg Kiselyov and 
Chung-chieh Shan really interesting in highlighting this.


> Over the years I've spent a lot of time pondering why. My underlying sense is 
> that there are some fundamental dualities in computational machines. Static 
> vs. dynamic. Data vs. code. Nouns vs. verbs. Location vs. time. It is 
> possible, of course, to 'cast' one onto the other, there are plenty of 
> examples of 'jumping' particularly in languages wrt. nouns and verbs. But I 
> think that decompositions become 'easier' for us to understand when we 
> partition them along the 'natural' lines of what they are underneath.

Yes, I think that's one of the points of fonc - finding better 
representations of meaning and representations of execution. Goedel, Escher, 
Bach is at least in part heavily focused on this duality between encodings of 
meaning in static representations vs. dynamic executions of systems. A finite 
static representation can describe something infinite at execution time (e.g. 
a regex's Kleene star). Likewise, a finite computation description can 
generate an artifact whose static meaning would take an infinite amount of 
definition to spell out (any sort of fractal-generating code).
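
As a tiny sketch of the first direction (assuming POSIX regex is available): 
the pattern below is a finite, static description, yet the set of strings it 
accepts at execution time is infinite.

  #include <regex.h>
  #include <stdio.h>

  int main(void)
  {
      regex_t re;
      regcomp(&re, "^a*$", REG_EXTENDED | REG_NOSUB); /* finite description */

      /* matches strings of any length: "", "a", "aa", "aaa", ... */
      const char *samples[] = { "", "a", "aaaa", "aaaaaaaa", "ab" };
      for (int i = 0; i < 5; i++)
          printf("\"%s\" -> %s\n", samples[i],
                 regexec(&re, samples[i], 0, NULL, 0) == 0 ? "match"
                                                           : "no match");

      regfree(&re);
      return 0;
  }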


> My thinking some time ago as it applies to OO is that the fundamental 
> primitive, an object, essentially mixes its metaphors (sort of).

Yes, because I think the pop-culture meaning of "object" is not the right 
thing. They took what they knew (structured programming) plus the "news" of 
objects and made algebraic data types with named slots (dicts that are finite 
maps).

The metaphor I use is: imagine if mechanical engineering worked like "OO" 
software. One engineer whips up a steel I-beam in CAD and commits it to the 
repo. Internally it's obviously made up of atoms, which have quarks with 
spins. Externally, steel beams have well-understood properties and will hold 
up a skyscraper when done right. Another engineer comes along, sees this beam 
thing, and notices that it has almost the same structure as the rubber bungee 
cord he needs to build. Thankfully, the inner quark "object" has setters for 
count and spin - "SWEET, code re-use!" he thinks.

It's the type of thing you could never do in real life.
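
In code, the story looks something like this hypothetical sketch (all names 
invented): the internal representation leaks out through setters, so another 
piece of code can "re-use" the beam by retuning its insides.

  /* The internal detail that was never meant to be part of the interface. */
  typedef struct { int count; double spin; } quark;

  typedef struct {
      quark  lattice[1000];   /* the "atoms" inside the beam */
      double rated_load_kn;   /* the externally meaningful property */
  } steel_beam;

  /* The leak: exposing the internals as if they were the interface. */
  void beam_set_quark_spin(steel_beam *b, int i, double s)
  {
      b->lattice[i].spin = s;
  }

  /* "SWEET, code re-use!" - bend the beam into a bungee cord by poking at
     its quarks; the load rating silently stops meaning anything. */
  void make_bungee_cord(steel_beam *b)
  {
      for (int i = 0; i < 1000; i++)
          beam_set_quark_spin(b, i, 0.5);
  }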

You might think this is hyperbole, but until the lipid membrane came along 
there could NOT be life, since everything was a violent pile of ionized atoms. 
Electrons would be ripped away and molecules ripped apart. The information 
encoded in DNA could NOT exist until the lipid membrane came along - a 
protective boundary that separates what kind of "computation" can happen 
"where", plus a way of "sensing" and knowing what happens around "you".

Simply put, I think we need a better abstraction tool beyond what we have 
today with the function. Yes, the abstraction might be "built / simulated" 
out of functions - see the lambda papers. Lipid membranes are also just an 
illusion of encapsulation - permeable things made out of the same(ish) 
low-level stuff as DNA. Penicillin annihilates the cell wall of certain 
bacteria and their guts just explode.

shawn