[ The Types Forum, http://lists.seas.upenn.edu/mailman/listinfo/types-list ]
One thing this whole conversation ignores is the role of types in compilation. Even early FORTRAN compilers needed to make distinctions (integral versus floating-point) in order to (a) determine how much space to allocate, (b) [possibly] align values in memory, and (c) dispatch to the appropriate instructions for doing arithmetic (i.e., "+" was overloaded). These distinctions are still relevant and important today. Indeed, types have played a central role in the efficient compilation of languages, and in that sense are extremely enabling.

A uni-typed language, in the parlance of Harper, requires a universal representation, which usually means a level of indirection, perhaps additional tags so that we can discriminate among the cases, etc. Today, a good compiler for a uni-typed language first reconstructs the underlying "Scott-types" (or some refinement thereof) in the hope of avoiding the costs of universal representations. (See, e.g., the JVML, LLVM, GHC's IR, etc.) So practice and theory are quite aligned here: types form the foundations, and the illusion of a dynamically-typed setting is there for the convenience of the user.

Even in the uni-typed languages, not all operations are allowed on all values (e.g., taking the car/head of a lambda), even though these operations are certainly defined at the bit level. So there is some imposition of abstraction before the fact, enabling reasoning at the level of the language's abstract machine instead of the concrete bit level. I suppose if you turn your head around backwards enough, you see this as an "imposition" on your God-given right to do what you want with the bits, but others see this as a constructive way to define a language.

-Greg
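
P.S. To make the universal-representation point concrete, here is a rough sketch, in Haskell, of the sort of tagged value type a uni-typed implementation falls back on when no types have been reconstructed. The names (Value, add, car) are mine, purely for illustration, and are not meant to reflect any particular implementation.

  module Universal where

  -- One universal type: every value sits behind a level of indirection
  -- and carries a tag so the primitives can discriminate.
  data Value
    = VInt  Integer            -- boxed number
    | VCons Value Value        -- cons cell (car, cdr)
    | VClos (Value -> Value)   -- lambda, represented as a host closure

  -- "+" must dispatch on the tags of both arguments at run time.
  add :: Value -> Value -> Value
  add (VInt m) (VInt n) = VInt (m + n)
  add _        _        = error "add: expected two numbers"

  -- car is defined only on cons cells; taking the car of a lambda is
  -- rejected at the level of the abstract machine, even though some
  -- bit-level reading of the operation certainly exists.
  car :: Value -> Value
  car (VCons hd _) = hd
  car _            = error "car: expected a cons cell"

A compiler that reconstructs the Scott-types (or a refinement) can compile code known to manipulate only VInt values down to unboxed machine integers, skipping both the indirection and the tag checks above; that is exactly the payoff of the type reconstruction mentioned earlier.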
