Frank A. Christoph <[EMAIL PROTECTED]> writes:

> It seems to me that a compiler would be an ideal candidate for 
> being written in an imperative language. The number of times GHC 
> has been too slow and memory-hungry for me indicates that Haskell 
> is not suitable for writing anything as general-purpose as a compiler.


Haskell implementors,

please, provide a compiler written in the pure functional Haskell
language - just in the true functional subset.

No matter how slow and memory-hungry it turns out to be.
It is only as expensive to run as the most developed Haskell
implementation is inefficient.
Using another language would mean the developers themselves do not
hope to make their tool efficient.

This is common to all high-level functional languages.

Another point.
People understand the word `compilation' very differently.

(-Onot)       obtain the code fast; the code should not be too bad.

(-O)          apply several standard optimizations: inlining and others.

(-O-problem)  think hard about the problem and try to generate the
              cleverest code.

To my mind, people think only of the (-Onot) and (-O) modes when they
talk of compiler efficiency. This is a tradition from the epoch of
primitive programming, when compilation was understood mainly as
converting to machine code.

But you see, (-O-problem) may be quite different.
I think it has to be mainly program transformation, not assembly
(or C) coding.
Imagine a user program for sorting. The user does not want to think
much and implements it as insertion sort: 4-8 lines of source.
Under (-O-problem), we could imagine a compiler that thinks for 10
hours, takes 500 Mbyte of memory, and produces code for the merge
sort algorithm. Imagine also that there is a compiler option for
asymptotic cost optimization.
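
To make the example concrete, here is roughly what the two programs
could look like in Haskell (my own sketch; no existing compiler
performs this rewrite):

    -- What the user writes: insertion sort, about n^2 comparisons.
    insert :: Ord a => a -> [a] -> [a]
    insert x []         = [x]
    insert x (y:ys)
        | x <= y        = x : y : ys
        | otherwise     = y : insert x ys

    isort :: Ord a => [a] -> [a]
    isort = foldr insert []

    -- What the imagined compiler derives: merge sort,
    -- about n * log n comparisons.
    merge :: Ord a => [a] -> [a] -> [a]
    merge []     ys     = ys
    merge xs     []     = xs
    merge (x:xs) (y:ys)
        | x <= y        = x : merge xs (y:ys)
        | otherwise     = y : merge (x:xs) ys

    msort :: Ord a => [a] -> [a]
    msort []  = []
    msort [x] = [x]
    msort xs  = merge (msort as) (msort bs)
                where (as, bs) = splitAt (length xs `div` 2) xs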

Is such a compiler a "slow monster"?
By no means. It is the quickest and cleverest guy, because it has
produced a program that is billions of times faster on large data,
and because other "quick" compilers (written in C, in assembly, in
hardware!) may work for many years and still obtain worse code.
This is like the situation in scientific research.
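
To see where "billions" comes from, a rough count of comparisons
(my own numbers, taking n = 10^11 elements):

    insertion sort:  ~ n^2        =  10^22
    merge sort:      ~ n * log2 n ~  3.7 * 10^12
    speedup:         ~ n / log2 n ~  2.7 * 10^9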

One could reply: 
"We cannot think now of such a clever compiler. It is not real."

First, I think this example with sorting will become real soon.
Second, there are many problems of middle difficulty.
For example, do you expect
                           map negate $ map negate [1..1000] :: [Int]
to transform to
                           [1..1000] :: [Int]

by the recent compilers - under, say, the -O2 key?
I do expect it, and it annoys me when they do not.
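
For what it is worth, GHC does let the programmer hint such
middle-difficulty transformations by hand, via its RULES pragma.
A minimal sketch (the rule names are mine; the first rule restates
what GHC's list fusion already achieves, and the second is only sound
because negate . negate = id at Int):

    module MapNegate where

    {-# RULES
    "map/map"        forall f g xs.
                     map f (map g xs) = map (f . g) xs
    "negate/negate"  forall (xs :: [Int]).
                     map (negate . negate) xs = xs
      #-}

Whether these rules actually fire depends on inlining details, and a
hand-written rule is still a hint, not the automatic (-O-problem)
reasoning argued for above.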

In other words, when speaking of compiler efficiency, let us
recall that there also exists some sort of higher optimization
mode. And there, you would hardly change anything by programming
in C or assembly.

------------------
Sergey Mechveliani
[EMAIL PROTECTED]






