On 2010-11-26, at 2:16 AM, John Cowan wrote:

> [email protected] scripsit:
> 
>> If fractions are allowed, why count milliseconds?
> 
> Given that it should be easy to implement most dates as fixed-size
> objects, milliseconds seem like a good compromise between range and
> precision.

Please don't count time using milliseconds.  It clutters my brain to have to
remember a unit of time other than plain seconds.

Moreover, the choice of milliseconds, rather than microseconds or nanoseconds,
is purely an artifact of the current speed of computers.  If you choose
milliseconds as the unit of time in the hope of getting better resolution using
integers, you'll probably say 2 years from now "milliseconds aren't precise
enough for these fast CPUs, let's change the spec to use microseconds", and then
20 years from now "darn! these CPUs have become fast! let's change the spec to
use nanoseconds", etc.  Integers shouldn't be used for measuring time points
because different applications need different resolutions.

With a 64-bit float, you can represent a time interval of up to 3 months with
nanosecond resolution, and up to 266 years with microsecond resolution.  I
don't see any practical reason for wanting more than this.
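
(For the curious, here is a minimal sketch, in portable Scheme, of where
figures of that order come from.  It assumes "resolution" means the span over
which every multiple of the unit is still exactly representable in an IEEE 754
double, i.e. integers up to 2^53; the exact cut-off depends on how you state
the bound, and the variable names below are just for illustration.)

;; Every integer up to 2^53 is exactly representable in a 64-bit float.
(define mantissa-span (expt 2 53))              ; 9007199254740992

;; Counting nanoseconds: 2^53 ns is about 9.0e6 seconds, i.e. roughly 104 days.
(define ns-range-days
  (/ (* mantissa-span 1e-9) 86400))             ; ~104.2

;; Counting microseconds: 2^53 us is about 9.0e9 seconds, i.e. a few hundred years.
(define us-range-years
  (/ (* mantissa-span 1e-6) (* 365.25 86400)))  ; ~285.4

(display ns-range-days) (newline)
(display us-range-years) (newline)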

Marc


