Hi! On the performance (or not) of high-level code: I'm working on a compiler with a strong emphasis on generating good code for programs written in a fairly generic style. The work is far from complete, but some highlights of the compiler are:
- Aggressive removal of higher-order functions, except when functions are truly used as data. So everything like foldr, map, compose and similar user-defined functions will be removed, across module boundaries. The cross-module part is what is difficult and innovative; otherwise it's just plain old partial evaluation. (See the first sketch after this list.)
- I plan to use deforestation as well, but I have not decided on the details yet. (See the second sketch below.)
- Cloning, to generate different versions of a function for different call sites. E.g. a non-strict function might be legal to optimize more aggressively if it is only called with evaluated arguments. It will be interesting to see how much this gives. (See the third sketch below.)

I suspect that part of the performance problems of (lazy) functional languages comes not just from raw overhead, but also from their encouraging the programmer to use linked data structures rather than arrays, and similar biases.
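To make the first point concrete, here is a hand-written sketch of what removing a higher-order function amounts to; the names sumList and sumList' are my own illustration, not output from the compiler:

    -- Generic, higher-order style: the loop structure lives in foldr
    -- and the functional argument (+) is passed around at run time.
    sumList :: [Int] -> Int
    sumList = foldr (+) 0

    -- After higher-order removal the compiler would, in effect, have
    -- produced a first-order specialisation with (+) inlined away:
    sumList' :: [Int] -> Int
    sumList' []     = 0
    sumList' (x:xs) = x + sumList' xs

When foldr lives in another module than its caller, getting from sumList to sumList' is exactly the cross-module part mentioned above.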
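For the second point: deforestation removes the intermediate structure that glues two traversals together. Again a hand-fused illustration under names of my own choosing, not a claim about what the compiler actually emits:

    -- Before: map builds an intermediate list that sum then consumes.
    doubleAndSum :: [Int] -> Int
    doubleAndSum xs = sum (map (*2) xs)

    -- After deforestation the two traversals become one loop and the
    -- intermediate list is never allocated:
    doubleAndSum' :: [Int] -> Int
    doubleAndSum' = go 0
      where
        go acc []     = acc
        go acc (x:xs) = go (acc + 2*x) xs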
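And for the third point, a sketch of call-site cloning in plain Haskell 98; pick and pickStrict are hypothetical names, and the clone is shown hand-written rather than compiler-generated:

    -- A non-strict function: depending on b, one of x and y is never
    -- demanded, so in general callers must pass (possibly unevaluated)
    -- thunks for both.
    pick :: Bool -> Int -> Int -> Int
    pick b x y = if b then x else y

    -- A clone for call sites where both arguments are known to be
    -- evaluated already: there it is safe to be strict in x and y,
    -- so the caller need not build thunks for them.
    pickStrict :: Bool -> Int -> Int -> Int
    pickStrict b x y = x `seq` y `seq` (if b then x else y)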
Cheers, /kff