I am new to Haskell, but I find your assertions surprising, given
that in my experience the really performance-critical code is
small, and the rest can even be interpreted.
As far as I know, C/C++ and similar languages are not really that
advanced with respect to whole-program optimization (not much more
than inlining).
I had the impression that Haskell, until the shootout push, was not
good at optimizing some common computational kernels and lacked
optimized libraries for them, but is now in much better shape (for
GHC), and with what Don is doing, hopefully it will stay so.
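For concreteness, here is a minimal sketch (my own, not taken from
the shootout) of the kind of kernel I mean: the strict fold is the
sort of accumulator loop that GHC now compiles into tight code,
while the naive lazy fold shows the old failure mode.

    -- Minimal sketch of a computational kernel: summing a range of Ints.
    import Data.List (foldl')

    -- Naive lazy fold: builds a long chain of thunks (especially when
    -- unoptimized), so it can exhaust memory on large inputs.
    sumLazy :: Int -> Int
    sumLazy n = foldl (+) 0 [1 .. n]

    -- Strict fold: forces the accumulator at each step and runs in
    -- constant space.
    sumStrict :: Int -> Int
    sumStrict n = foldl' (+) 0 [1 .. n]

    main :: IO ()
    main = print (sumStrict 100000000)
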
Can you corroborate your points a little more?
cheers
Fawzi
On Feb 26, 2007, at 3:43 AM, Andrzej Jaworski wrote:
It sounds reasonable. However, knowledge of how a program performs
in micro-steps does not add up, so the benchmarks may whet an
appetite for a lunch that never arrives. I have pointed to such an
example: an astonishing and unexplained underperformance of Haskell
with all the profiling information at hand.
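For reference, a minimal sketch of how that kind of profiling
information is typically gathered with GHC (the module and
cost-centre names here are hypothetical):

    -- Hypothetical example: annotate a suspect kernel with a cost
    -- centre so the profiler attributes time and allocation to it.
    module Main where

    kernel :: Int -> Int
    kernel n = {-# SCC "kernel" #-} sum [1 .. n]

    main :: IO ()
    main = print (kernel 1000000)

    -- Build with profiling and run with the profiling runtime:
    --   ghc --make -prof -auto-all Main.hs
    --   ./Main +RTS -p
    -- The per-cost-centre breakdown then appears in Main.prof.
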
I guess Haskell compilers are not particularly good at detecting
specific properties of a program and hence at optimizing for them.
This, however, shows up with size, so Donald's benchmarks cannot
catch it.
For this reason, undiagnosed and untreated, Haskell has been
abandoned, for example, in Algebraic Dynamic Programming, in spite
of its unparalleled expressive power and a lot of hope. In ILP/IFP
and GP it failed too.
Cheers,
--Andrzej
_______________________________________________
Haskell mailing list
Haskell@haskell.org
http://www.haskell.org/mailman/listinfo/haskell