Re: Primitive types and Prelude shenanigans

2001-02-21 Thread Fergus Henderson

On 21-Feb-2001, Marcin 'Qrczak' Kowalczyk [EMAIL PROTECTED] wrote:
> Wed, 21 Feb 2001 12:55:37 +1100, Fergus Henderson [EMAIL PROTECTED] writes:
> 
> > The documentation in the Haskell report does not say what
> > `fromInteger' should do for `Int', but the Hugs behaviour definitely
> > seems preferable, IMHO.
> 
> Sometimes yes. But for playing with Word8, Int8, CChar etc. it's
> sometimes needed to just cast bits without overflow checking, to
> convert between "signed bytes" and "unsigned bytes".

Both are desirable in different situations.  But if you want to ignore
overflow, you should have to say so explicitly.  `fromInteger' is
implicitly applied to literals, and implicit truncation is dangerous,
so `fromInteger' should not truncate.

There should be a different function for conversions that silently
truncate.  You can implement such a function yourself, of course,
e.g. as follows:

trunc :: (Bounded a, Integral a) => Integer -> a
trunc x = res
    where min, max, size, modulus, result :: Integer
          min = toInteger (minBound `asTypeOf` res)
          max = toInteger (maxBound `asTypeOf` res)
          size = max - min + 1
          modulus = x `mod` size
          result = if modulus > max then modulus - size else modulus
          res = fromInteger result

But it is probably worth including something like this in the standard
library, perhaps as a type class method.
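For concreteness, here is the sketch above exercised at `Int8' and `Word8'. It is the same definition, with `min'/`max' renamed to `lo'/`hi' purely to avoid shadowing the Prelude, plus a small driver:

```haskell
import Data.Int  (Int8)
import Data.Word (Word8)

-- Same definition as above; min/max renamed to lo/hi.
trunc :: (Bounded a, Integral a) => Integer -> a
trunc x = res
  where
    lo   = toInteger (minBound `asTypeOf` res)
    hi   = toInteger (maxBound `asTypeOf` res)
    size = hi - lo + 1
    m    = x `mod` size
    res  = fromInteger (if m > hi then m - size else m)

main :: IO ()
main = do
  print (trunc 200  :: Int8)   -- 200 exceeds maxBound; wraps to -56
  print (trunc 200  :: Word8)  -- 200 fits; stays 200
  print (trunc (-1) :: Word8)  -- -1 wraps to 255
```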

-- 
Fergus Henderson [EMAIL PROTECTED]  |  "I have always known that the pursuit
WWW: http://www.cs.mu.oz.au/~fjh    |  of excellence is a lethal habit"
                                    |     -- the last words of T. S. Garp.

___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe



Re: Inferring from context declarations

2001-02-21 Thread D. Tweed

George Russell wrote:
 
> (3) Simon Peyton Jones' comments about dictionary passing are a red herring,
> since they assume a particular form of compiler.  Various (MLj, MLton)
> ML compilers already inline out all polymorphism. Some C++ compilers/linkers
> do it in a rather crude way as well, for templates.  If you can do it,
> you can forget about dictionary passing.

[Standard disclaimer: I write prototype code that's never `finished' to
ever-changing specs in a university environment; other people probably
view things differently.]

I'm not sure I'd agree with this. Note that there are two levels: inlining
polymorphic functions at the call site, and `instantiating polymorphic
functions at each usage type' without doing the inlining. C++ compilers
have to do at least the second because of the prevailing philosophy of
what templates are (i.e., that they're safer function-macros). Some of the
time this is what's wanted, but sometimes it imposes annoying compilation
issues (the source code of the polymorphic function has to be available
every time you want to use the function on a new class, even if it's not
time critical -- which isn't the case for Haskell). I also often
write/generate very large polymorphic functions that, in an ideal world
(where compilers can do _serious, serious_ magic), I'd prefer to implement
using something similar to dictionary passing. I'd argue
that keeping flexibility about polymorphic function implementation (which
assumes some default but can be overridden by the programmer) in Haskell
compilers is a Good Thing.
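To make the dictionary-passing side of that trade-off concrete: a class constraint can be modelled as an ordinary record of operations passed at run time, so one compiled body serves every type. This is only a hand-rolled sketch (the names `ShowDict', `describe', etc. are invented for illustration, not what any actual compiler emits):

```haskell
-- A hand-rolled "dictionary": the run-time record a compiler could pass
-- instead of instantiating the function separately at each type.
data ShowDict a = ShowDict { showD :: a -> String }

intDict :: ShowDict Int
intDict = ShowDict { showD = show }

boolDict :: ShowDict Bool
boolDict = ShowDict { showD = \b -> if b then "yes" else "no" }

-- One compiled body serves every type: the dictionary is an ordinary
-- argument, so no per-type copy of the code is needed.
describe :: ShowDict a -> a -> String
describe d x = "value: " ++ showD d x

main :: IO ()
main = do
  putStrLn (describe intDict 42)     -- value: 42
  putStrLn (describe boolDict True)  -- value: yes
```

The cost is an indirect call through the record on each use; the instantiation strategy trades that away for per-type code copies, which is exactly the flexibility argued for above.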

Given that, unless computing hardware really revolutionises, the
`speed/memory' profile of today's desktop PC is going to recur in wearable
computers/PDAs/etc., I believe that in 20 years' time we'll still be figuring
out the same trade-offs, and so need to keep this flexibility.

___cheers,_dave
www.cs.bris.ac.uk/~tweed/pi.htm|tweed's law:  however many computers
email: [EMAIL PROTECTED] |you have, half your time is spent
work tel: (0117) 954-5250  |waiting for compilations to finish.


