Mon, 24 Jul 2000 19:01:42 +0100, Claus Reinke <[EMAIL PROTECTED]> pisze:

> They might download one of the libraries that looks portable, 
> and find it working, more or less, but with quite a few changes 
> that would have to be made to it, too. Then they might look for
> todo-lists, release plans, etc., but those are all just guesses, 
> telling us what the authors would have liked to do some time 
> ago (planned release dates are much better, but what if they 
> lie in the past? Wait for some months for the imminent new 
> release? Contact the authors, who -presumably- would have
> fulfilled their own promises, had they been able to?).

It is interesting that:
1. I agree that "recently touched" is a good measure of the usefulness
   of a Haskell library, but
2. this does not seem to apply to C libraries, at least not to the
   same degree.

There is a general problem of how to make Haskell more useful for
practical work (a problem shared by many languages other than C). For
me it's a big problem - I cannot honestly promote Haskell where I
would want to. To improve the situation, perhaps we should think about
how to make "recently touched" a less important parameter, i.e. how
to make old libraries usable in new programs.

I guess that one reason is the rapid evolution of the language,
including the basic technical libraries. It's unfortunate, because I
love using a well-suited language construct or library even if it's
non-standard, rather than forcing the interface into old constraints,
and I hate both the glitches caused by backward compatibility and
compatibility that kills innovation. I cannot imagine what would help
here, other than working _fast_ so that new cool things become widely
available and reliable - but we do that all the time!

Another problem, specific to GHC (or maybe not - I don't follow the
evolution of other implementations so closely), is the lack of binary
compatibility between - in practice - any two versions of GHC. I
imagine that if I told somebody that libraries need recompilation with
each release of the compiler, he would only laugh, and that alone
would be reason enough not to use the language for serious work.
Again, I don't know what would help here - evolution of the physical
representation of things is necessary, and Haskell has a richer RTS
interface than C. AFAIK C++ has similar problems: e.g. gcc is planning
to introduce a new name mangling scheme because the old one could not
express everything needed for ANSI C++. But C does not have such
problems at the language level, and is almost exclusively used on
Unices.

The last problem would not be so big if releases were source
compatible. But they are not, because of the first problem.

So in general there is a conflict between making the language better
and keeping it compatible. For now the only solution is probably to
ensure that libraries are being ported to current tools and that
their status in this process is clearly marked. But IMHO it would be
more valuable to invent ways to make this less necessary.


The second thing that worries me is that Haskell seems to scale
poorly. I once wanted to compare the speed of some operations on
finite maps with C++, hoping that laziness would help because many
parts of the resulting map were not needed later. The result was a
stack overflow in Haskell once the maps got sufficiently large; the
C++ standard library was able to process much larger maps.
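The original test case is not shown here, but a minimal sketch of the
usual culprit - lazy accumulation building a deep chain of thunks
before anything is forced - looks like this (assuming the strict
foldl' from Data.List as the fix):

```haskell
import Data.List (foldl')

-- Lazy left fold: before the result is demanded, this builds n
-- nested (+) thunks, which can overflow the stack for large n.
lazySum :: Int -> Int
lazySum n = foldl (+) 0 [1..n]

-- Strict left fold: the accumulator is forced at each step,
-- so the fold runs in constant stack space.
strictSum :: Int -> Int
strictSum n = foldl' (+) 0 [1..n]

main :: IO ()
main = print (strictSum 1000000)  -- 500000500000
```

The same effect hits map-building loops: inserting a million elements
with a lazy accumulator piles up suspensions that are only unwound,
deeply, when the map is finally inspected.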

I also don't like that GHC forces you to specify the heap size
explicitly whenever it exceeds the fixed default of 64MB.

As much as I love the simplicity of Haskell's String, I am worried
that it takes several times more memory than the raw information.
Using PackedStrings is not a perfect solution - it makes the style of
handling strings very different, whereas plain Strings fit the lazy
paradigm so well. And although C++'s string is a separate type,
vector<char> would be good enough too: the natural generic container
is efficient. It's a shame that C++ does this better.
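To illustrate what is at stake, a small made-up example of the lazy
String style: ordinary list functions compose, and only the demanded
prefix of the input is ever forced - exactly what a packed, strict
representation gives up. (Each Char in a [Char] costs a cons cell
plus the character itself, on the order of tens of bytes per
character, versus one byte packed.)

```haskell
import Data.Char (toUpper)

-- Uppercase the first line of the input; because String is a lazy
-- list, nothing past the first '\n' is ever examined.
firstLineUpper :: String -> String
firstLineUpper = map toUpper . takeWhile (/= '\n')

main :: IO ()
main = putStrLn (firstLineUpper "hello world\nthe rest is not forced")
```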

Maybe improving arrays would help with the last point. Things are
already getting better with the unified IArray/MArray interface in
GHC, but it does not provide a natural interface for alternative
strings - string-like arrays want only 0-based indices (or 1-based in
some languages, but certainly not arbitrarily based). Introducing yet
another specific array interface would be bad precisely because it
would be another specific interface.
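A tiny example of the mismatch, assuming GHC's Data.Array: Ix-based
arrays accept an arbitrary lower bound, so a string type fixed to
0-based indexing does not fall out of the interface naturally.

```haskell
import Data.Array

-- Ix-based arrays allow any lower bound...
a0, a7 :: Array Int Char
a0 = listArray (0, 4)  "hello"   -- 0-based
a7 = listArray (7, 11) "hello"   -- arbitrarily based; same contents

-- ...so code cannot assume index 0 is the first character.
main :: IO ()
main = print (a0 ! 0, a7 ! 7)    -- both are 'h'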

I am trying to design a unified collection interface, starting from
what Edison provides, but based on classes instead of modules. It
would accommodate 0-based arrays nicely and make it possible to
switch the representation: instead of building 0-based arrays out of
arbitrarily indexed ones (indexed by any Ix type), one could do the
reverse. A problem with this interface (besides looking more
complicated than I had hoped) is that it works only in an overclocked
version of Hugs (the original runs out of memory during typechecking).
GHC has two problems with non-standard classes here, both important:
one I reported and never saw a definitive answer to (method contexts
constraining only type variables from the class head); the other is
incompletely implemented functional dependencies. So again I was hit
by a cause of the first problem: language extensions that are too
fresh and soft.
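As a rough sketch of the kind of class-based interface I mean (the
names here are invented for illustration, not the actual design), the
element type is tied to the collection type by a functional
dependency:

```haskell
{-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies,
             FlexibleInstances #-}

-- Hypothetical sketch of a unified collection class. The dependency
-- c -> e says each collection type determines its element type, so
-- the same operations work across representations.
class Collection c e | c -> e where
  empty  :: c
  insert :: e -> c -> c
  toL    :: c -> [e]

-- Lists as the simplest instance; arrays, finite maps, packed
-- strings etc. would be further instances of the same class.
instance Collection [a] a where
  empty  = []
  insert = (:)
  toL    = id

main :: IO ()
main = print (toL (insert 1 (insert 2 (empty :: [Int]))))
```

Swapping in a different representation would then only mean changing
a type annotation, not the code that uses the collection.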


The third problem is the lack of libraries, especially interfaces to
important C libraries such as GUIs, operating system features, or
cryptography.

I'm afraid that people will not write many libraries while faced with
the first problem. Also, the FFI and the infrastructure for making
custom packages that are easy to install and use are still evolving.
IMHO the lack of libraries is a consequence of the other problems.
More important than maintaining information about existing libraries
is creating an environment where new libraries can grow and remain
usable for a long time.

-- 
 __("<  Marcin Kowalczyk * [EMAIL PROTECTED] http://qrczak.ids.net.pl/
 \__/            GCS/M d- s+:-- a23 C+++$ UL++>++++$ P+++ L++>++++$ E-
  ^^                W++ N+++ o? K? w(---) O? M- V? PS-- PE++ Y? PGP+ t
QRCZAK                5? X- R tv-- b+>++ DI D- G+ e>++++ h! r--%>++ y-

