It seems to me that the major problem with the leap second is the inability of current computer operating systems to represent it, and this is due to their storing time as a count of seconds since 1970 rather than writing it out as we would by hand. While it doubtless made sense in the days of floppy discs to squeeze the date and time into a single 4-byte number, with modern communication speeds and storage capacities that no longer seems to be a requirement. The (numerical) date and time could be packed into 24 ASCII characters, or 12 bytes if BCD were used. Would it not make sense now for the next generation of operating systems to do that? Yes, those who need to find the elapsed time between two timestamps would still have a problem, but isn't the overwhelmingly more common requirement just to represent the date/time, and to be able to show easily whether one timestamp is before or after another?
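For what it's worth, here is a minimal sketch in C of what I mean. The layout (YYYYMMDDHHMMSS followed by ten fractional-second digits, making 24 in all) is just one convention I've assumed for illustration, not a proposal for a standard. The point is that a fixed-width, zero-padded digit string sorts chronologically with a plain string compare, and the leap second 23:59:60 is directly representable:

#include <stdio.h>
#include <string.h>

#define TS_LEN 24

/* Compare two fixed-width digit-string timestamps.  Because every
 * field is zero-padded and ordered most-significant first, plain
 * lexicographic comparison gives chronological ordering. */
static int ts_compare(const char *a, const char *b)
{
    return strcmp(a, b);        /* <0 earlier, 0 equal, >0 later */
}

/* Pack the 24 ASCII digits into 12 BCD bytes, two digits per byte.
 * memcmp() on the packed form preserves the same ordering, since
 * the more significant digit sits in the high nibble. */
static void ts_to_bcd(const char ts[TS_LEN], unsigned char out[TS_LEN / 2])
{
    for (int i = 0; i < TS_LEN / 2; i++)
        out[i] = (unsigned char)(((ts[2 * i] - '0') << 4)
                                 | (ts[2 * i + 1] - '0'));
}

int main(void)
{
    /* The leap second at the end of 2016 appears as second 60 --
     * no epoch arithmetic, no ambiguity. */
    const char *leap  = "201612312359600000000000";
    const char *after = "201701010000000000000000";

    printf("leap second sorts before new year? %s\n",
           ts_compare(leap, after) < 0 ? "yes" : "no");

    unsigned char bcd[TS_LEN / 2];
    ts_to_bcd(leap, bcd);
    printf("first BCD byte: 0x%02X\n", bcd[0]);   /* 0x20 for "20" */
    return 0;
}

Note that the BCD packing halves the storage, as I suggested above, without losing the ordering property: a byte-wise compare of the packed form agrees with the character-wise compare of the ASCII form.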
Peter
