> > For me, it's a very important property of a programming language. OO,
> > if done right, seems to naturally fit the way I think.
>
> So first, I think this is learned behavior. I'm old enough to
> personally remember when OO went mainstream, and people thought it was
> incredibly alien. And with apologies, we should ask whether the
> "natural fit" indicates a positive property of OO or a negative
> property of how we are training programmers.
The question is justified. However, with all due respect, I tend to believe that the answer is that it's not all just learned behaviour. I'm old enough to have been educated pre-OO (well, technically Smalltalk and CLOS already existed, but not on the kind of computers I had access to), and I have seen and used quite a number of different programming paradigms in my life. And they are not all equal; I remember with horror the kind of nonlocality and spaghetti code one can end up with in some of the lesser languages.

There are two aspects here that I think are relevant. Number one is the power of the abstractions provided by a language: does it allow me to write code that's compact, where every function can locally be judged to be correct, and where adding features doesn't require changes all over the program? And number two is the "brain mapping": how well are the abstractions suited to the way I think about the world?

As for the brain mapping, there are two arguments to be made. The first one is that there is reason to believe that some of the aspects of how we think are hard-wired into our brains. I found the presentations on synesthesia by Vilayanur Ramachandran (on youtube or TED) to be quite enlightening in this respect: it turns out that quite a common synesthesia, perceiving numbers as having a specific color, happens because the area in our brain that represents numbers lives next to the area that represents colors, and sometimes, cross connections form. What I am taking from this is that the concept of numbers is hardwired into our brains. I'm pretty sure that abstraction is also hardwired. You can't start counting things without the ability to perceive the individual objects as belonging to a common class. Archeologists argue that abstraction as a human trait can be shown in artefacts 77000 years old (http://www.scientificamerican.com/article.cfm?id=ancient-engravings-push-b). Now what about classification, then?
I've found no literature on the neurocognitive side of classification; a neurobiologist I've asked is still searching. However, it seems to me that even very small children have no conceptual problem perceiving an individual apple to be a member of the set of all apples, and therefore a member of the set of all fruits, because the apple set is a subset of the fruit set. They might be using simpler terms to express that fact, though. :)

Also, and this brings me to the second brain mapping argument, there's quite a bit of history of applying classification to explain the world around us. Even if classification were just a learned trait (and my personal belief is that it's not), it is pervasive, and its documented history in science goes back to Greek philosophers like Parmenides and Aristotle and their ontology as part of metaphysics. It's good design not to violate cultural expectations. If you label the hot water tap in blue and the cold water tap in red in your house, people will end up burning themselves.

Please don't understand my arguments as saying that any of the existing languages gets OO right. Quite the opposite: my personal opinion is that OO in Java is not expressive enough, and in C++, it's a horrible mess. CLOS-style OO is better, but still not perfect. But I'd like to hold up that deep in there, there's a good idea.

> When I look at C++ or C#, or Java classes, I see two distinct things:
>
> - A "type inheritance" mechanism providing restricted refinement
>   (virtual functions)
> - An implementation reuse mechanism.

Wait a second, there's more to it. A class in these languages also defines storage, a namespace and a subtyping relationship. Quite a mess.

> The two are mixed together in a somewhat unfortunate way. If "class A"
> is truly a type, we might expect that there could be two
> implementations of this type, just as there can be many function
> implementations that satisfy the type int->int.
> In C++/C#/Java you can
> only do this by further extending the type, so types and
> implementations are hopelessly mingled.

I'm not sure I buy this argument. For starters, it confuses verbs and nouns (a.k.a. functions/methods and data/instances/objects/memory regions; note that all human languages have verbs and nouns, and all programming languages have code and data). I see no reason why types of objects should behave in the same way as types of functions; they just might be naturally different. Then again, one could argue that it is the object instances that satisfy the class type in the same way function instances satisfy the function type. Finally, there's the concept of abstract base classes, which pretty much does what you desire.

I do agree that C++/C#/Java classes hopelessly mingle a lot of orthogonal concepts, though. The very fact that virtual methods are "part" of the object disturbs me. The CLOS world has it much clearer: there are classes in the sense of data storage with a hierarchical taxonomy, and there are generic functions providing polymorphism.

> If we separate class type definitions from class implementation
> definitions, then we no longer have a conventional OO language. But we
> can still represent all of the OO patterns in the language that we
> *do* have.

I don't see the win compared to what's already there with abstract base classes. But then, I probably have a different and much poorer idea of what a type is than you do.

In the hope that my reflections are useful to you,

Andreas

_______________________________________________
bitc-dev mailing list
[email protected]
http://www.coyotos.org/mailman/listinfo/bitc-dev
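P.S. To make the abstract-base-class point concrete, here is a minimal Java sketch (the class names are my own invention, not anything from this thread): the abstract class plays the role of the type, and two unrelated implementations both satisfy it, much as two different function bodies can both have type int->int.

```java
// Sketch: an abstract base class as a "type", with two independent
// implementations. Illustrative only; names are hypothetical.

// The type: declares an operation, says nothing about storage.
abstract class Shape {
    abstract double area();
}

// Two implementations satisfying the same type.
class Circle extends Shape {
    private final double r;
    Circle(double r) { this.r = r; }
    double area() { return Math.PI * r * r; }
}

class Square extends Shape {
    private final double side;
    Square(double side) { this.side = side; }
    double area() { return side * side; }
}

public class AbcDemo {
    public static void main(String[] args) {
        // Client code depends only on the type, not on either implementation.
        Shape[] shapes = { new Circle(1.0), new Square(2.0) };
        for (Shape s : shapes) {
            System.out.printf("%.2f%n", s.area());
        }
    }
}
```

Of course, in Java the implementations still have to name the type via `extends`, which is exactly the mingling being complained about; in CLOS the methods would live in generic functions outside the classes entirely.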
