On Wed, Apr 18, 2001 at 05:38:02PM -0500, Travis Watkins wrote:
>
> I fear that this may be a bug:
>
> Integer? 10000000005
> false (this is correct, because an integer! uses only 4 bytes)
>
> integer? 2000000001
> true
>
> integer? (10000000005 / 5)
> false
> (note: this is 2000000001, the same value for which integer? returned
> true above)
Actually, this is correct behavior. 10000000005 is a decimal! because it is
too large to fit into an integer!. Dividing a decimal! by an integer! results
in another decimal!.
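
To make the cutoff concrete: a signed 32-bit integer! tops out at
2147483647, so 10000000005 is loaded as a decimal! before the division ever
happens. A rough sketch of that promotion rule, written in Python rather
than REBOL (the function name fits_int32 is my own, purely for
illustration):

```python
INT32_MAX = 2**31 - 1   # 2147483647, largest value a 4-byte signed integer! holds
INT32_MIN = -2**31

def fits_int32(n):
    """True if n could be stored as a 32-bit integer! rather than a decimal!."""
    return INT32_MIN <= n <= INT32_MAX

print(fits_int32(2000000001))        # fits, so integer? returns true
print(fits_int32(10000000005))       # too large, loaded as decimal! instead
print(fits_int32(10000000005 // 5))  # the quotient would fit, but the
                                     # division already happened in decimal!
```

The point being that the literal's type is fixed when it is loaded, not
after the arithmetic, so the result stays a decimal! even though its value
would fit in an integer!.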
--
Holger Kruse
[EMAIL PROTECTED]