On Aug 13, 2005, at 5:00 PM, Greg Hennessy wrote:

> Well, even though they are predictable, lots of computer programs still got them wrong.

I'm not interested in disputing this statement; rather, we've often heard about various problems in implementing and interpreting temporal algorithms and interfaces.  Assume this is true.  What is different about the handling of time compared with all the other algorithms and interfaces imposed on computing?  Certainly in scientific computing there is nothing particularly special about time handling compared to lots of other fairly fancy algorithms and complex interfaces.

But surely this applies to general-purpose computing as well - and not just to the subtleties of operating systems, either.  If a typical programmer can be expected to master hash tables, sorting algorithms (or anything else out of Knuth), the ridiculously "rich" class libraries of Java, public-key cryptography, pixel and other spatial transforms, bit and byte operators - why not leap seconds?
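To make the comparison concrete, here is a minimal sketch (mine, not from the original message) of what leap second bookkeeping amounts to in practice: a lookup of the published TAI-UTC offset in effect on a given UTC date.  The class and method names are made up for illustration, and only a few table entries are shown; a real implementation would load the full table published by the IERS rather than hard-coding it.

    import java.util.TreeMap;

    public class LeapSeconds {
        // Key: first UTC day (days since 1970-01-01) on which the offset applies.
        // Value: TAI - UTC in whole seconds from that day forward.
        // Illustrative entries only; load the full IERS table in real use.
        private static final TreeMap<Long, Integer> TABLE = new TreeMap<>();
        static {
            TABLE.put(epochDay(1997, 7, 1), 31);
            TABLE.put(epochDay(1999, 1, 1), 32);
            TABLE.put(epochDay(2006, 1, 1), 33);
        }

        // Days since 1970-01-01 for a calendar date.
        private static long epochDay(int year, int month, int day) {
            return java.time.LocalDate.of(year, month, day).toEpochDay();
        }

        /** TAI - UTC, in seconds, in effect on the given UTC calendar date. */
        public static int taiMinusUtc(int year, int month, int day) {
            Long key = TABLE.floorKey(epochDay(year, month, day));
            if (key == null) {
                throw new IllegalArgumentException("date precedes the table");
            }
            return TABLE.get(key);
        }

        public static void main(String[] args) {
            System.out.println(taiMinusUtc(2005, 8, 13));  // prints 32
        }
    }

The point of the sketch is that the core mechanism is an ordered table lookup - no harder than the data structures listed above - with the real difficulty lying in keeping the table current, since future leap seconds are announced only about six months in advance.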

I guess I'm a little unclear.  Nobody suggests that we dumb down J2EE or pretend that race conditions don't exist.  So why are leap seconds poison?

Rob Seaman
National Optical Astronomy Observatory
