Tue, 13 Jun 2000 19:57:18 +0200, Lennart Augustsson <[EMAIL PROTECTED]> writes:

> >         chr :: Int -> Char
> >         chr (I# i) | i >=# 0# && i <=# 255# = C# (chr# i)
> >                    | otherwise = error ("Prelude.chr: bad argument")
> 
> So a Haskell program that (perhaps inadvertently) uses a Unicode character
> will fail with ghc.

...not only because of the definition of chr, unfortunately.

> When is ghc going to implement full Haskell? :)

Simon Marlow said on 16 May:

> I agree it should be done.  But not for 4.07; we can start breaking
> the tree as soon as I've forked the 4.07 branch though (hopefully
> today...).

The branch was forked a long time ago. So could we pleeease...?

A question remains unanswered. The report requires Char to be exactly
16 bits. I loudly cry for letting it be >= 16 bits and <= Int, and
implementing 31 or 32 bits in GHC. But there is a problem:

Oops: ord would have to be allowed to return negative numbers when
Char is as wide as Int. Another solution is to make Char at least
one bit narrower than Int, and perhaps also no wider than 31 bits.
ISO-10646 describes a 31-bit space, and UTF-8 can encode up to
31 bits, so a UTF-8 encoder would then be portable without worrying
about Char values that don't fit, and a decoder could easily check
whether a code point is representable in Char: ord maxBound would be
guaranteed to be positive.
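
To make this concrete, here is a minimal sketch (my own illustration,
not GHC code) of the original 6-byte UTF-8 scheme, which covers the
whole 31-bit ISO-10646 space. It encodes Int code points directly,
so it does not depend on how wide Char is:

    import Data.Bits ((.&.), (.|.), shiftR)

    -- One code point (0 .. 2^31-1) to its UTF-8 bytes, using the
    -- original scheme with lead bytes up to 0xFC/0xFD (6 bytes).
    utf8Encode :: Int -> [Int]
    utf8Encode c
      | c < 0         = error "utf8Encode: negative code point"
      | c < 0x80      = [c]          -- 1 byte,  7 bits
      | c < 0x800     = bytes 1 0xC0 -- 2 bytes, 11 bits
      | c < 0x10000   = bytes 2 0xE0 -- 3 bytes, 16 bits
      | c < 0x200000  = bytes 3 0xF0 -- 4 bytes, 21 bits
      | c < 0x4000000 = bytes 4 0xF8 -- 5 bytes, 26 bits
      | otherwise     = bytes 5 0xFC -- 6 bytes, 31 bits
      where
        -- A lead byte marked with `mark', then n continuation
        -- bytes carrying 6 payload bits each.
        bytes n mark = (mark .|. shiftR c (6 * n))
                     : [ 0x80 .|. (shiftR c (6 * k) .&. 0x3F)
                       | k <- [n - 1, n - 2 .. 0] ]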

Choices I see (sizes in bits):
- 30 <= Int, 16 <= Char <= 31, Char <  Int
- 30 <= Int, 16 <= Char,       Char <  Int
- 30 <= Int, 16 <= Char,       Char <= Int
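
Under the first two choices Char is strictly narrower than Int, so
ord maxBound is guaranteed to be positive and the decoder's
representability check mentioned above becomes trivial. A minimal
sketch of that check (mine, not from any library):

    import Data.Char (chr, ord)

    -- Accept a decoded code point only if it fits in Char.  This
    -- is only meaningful when Char is narrower than Int, so that
    -- ord maxBound cannot overflow into a negative number.
    fromCodePoint :: Int -> Maybe Char
    fromCodePoint c
      | 0 <= c && c <= ord maxBound = Just (chr c)
      | otherwise                   = Nothing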

-- 
 __("<    Marcin Kowalczyk * [EMAIL PROTECTED] http://qrczak.ids.net.pl/
 \__/              GCS/M d- s+:-- a23 C+++$ UL++>++++$ P+++ L++>++++$ E-
  ^^                  W++ N+++ o? K? w(---) O? M- V? PS-- PE++ Y? PGP+ t
QRCZAK                  5? X- R tv-- b+>++ DI D- G+ e>++++ h! r--%>++ y-

