On Tue, 03 Jul 2012 11:23:51 +0200, Casper Bang <[email protected]> wrote:

Where I work, we're used to dealing with time-series data, and this really
sounds like developers making the mistake of assuming that there are always
60 seconds in a minute. Working with time is not as trivial as it sounds,
once you start rolling up data views and interpolating within and between
steps. In other words, I doubt that Java or the JVM is directly to blame
here; it sounds more like erroneous assumptions - but detail on the matter
is limited so far.
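
For what it's worth, the kind of assumption Casper describes is very easy to write without noticing. A toy sketch (the class and method names are mine, purely illustrative):

```java
// Illustrative only: a "seconds since midnight" conversion that bakes in
// the assumption that every minute has exactly 60 seconds.
public class NaiveTime {

    /** Naive conversion: fine for 23:59:59, wrong for a leap second. */
    static int secondOfDay(int h, int m, int s) {
        return h * 3600 + m * 60 + s;
    }
}
```

For the inserted leap second 23:59:60 this yields 86400 - the very same value a naive `dayIndex * 86400 + secondOfDay(...)` scheme assigns to 00:00:00 of the *next* day, so two distinct instants collide, and any roll-up keyed on that value silently merges them.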

If you read around: Google blogged a few weeks ago that their strategy is to slow down their internal NTP clocks so that, at the end of the day containing the leap second, they are back in sync with global time, while their systems never see the leap second itself:

http://www.ciol.com/News/News-Reports/Leap-Second-How-Google-saved-its-websites/164026/0/

This sounds like an admission that there's no safe code around to deal with leap seconds :-)
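
The idea of their "smear" can be sketched in a few lines. This is only my reading of the blogged strategy, not Google's actual code; the window length, names, and millisecond granularity are my assumptions:

```java
// Sketch of a leap "smear": instead of stepping the clock for the inserted
// leap second, run it slightly slow over a window before the leap, so that
// applications never observe 23:59:60. All values are milliseconds.
public class LeapSmear {
    static final long LEAP   = 1_000L;        // one inserted leap second
    static final long WINDOW = 72_000_000L;   // assumed 20-hour smear window

    /**
     * Maps a raw (true elapsed) reading to the smeared clock reading.
     * leapEnd is the raw instant at which the leap second has fully elapsed.
     */
    static long smeared(long raw, long leapEnd) {
        long start = leapEnd - WINDOW - LEAP;      // raw instant the smear begins
        if (raw <= start)   return raw;            // before the window: clocks agree
        if (raw >= leapEnd) return raw - LEAP;     // after: the leap has been absorbed
        // Inside the window: WINDOW + LEAP raw ms elapse while the smeared
        // clock advances only WINDOW ms, i.e. it runs slow by LEAP/(WINDOW+LEAP).
        return start + (raw - start) * WINDOW / (WINDOW + LEAP);
    }
}
```

The point is that the smeared clock stays monotonic and never repeats or skips a second; the price is that during the window it disagrees with true UTC by up to one second, which is presumably acceptable inside a single organization whose machines all smear the same way.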


Also, that paper referring to the "experience of 2005" perhaps confirms that the problem is not new; there was simply less news coverage a few years ago.



--
Fabrizio Giudici - Java Architect, Project Manager
Tidalwave s.a.s. - "We make Java work. Everywhere."
[email protected]
http://tidalwave.it - http://fabriziogiudici.it

--
You received this message because you are subscribed to the Google Groups "Java 
Posse" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/javaposse?hl=en.
