Darren Duncan wrote:
Dave Rolsky wrote:
It's not necessary to store each unit internally in order to get everything right, and not doing so makes some things a lot easier (though it makes other things harder ;)

I prefer to make the value representation simpler at the possible cost of calculation, focusing on being able to encode accurately what you know or don't know, or mean to say or don't mean to say. A representation in terms of an epoch goes the other way. With my preference, you don't need any calculations or lookup tables just to know what you have in human-understandable terms; you only need those to derive something else, such as the value relative to where you are.

Well, I can see reasons for both ways, but I prefer the more split-apart way. That's also what SQL, other data-focused languages, and some general-purpose languages do, not to mention what people themselves do, which I think is a good precedent.

Replying to myself here ...

One further advantage of the calendar-based representation over the epoch-based one is that it handles reasonable future dates correctly.

For example, if a user wanted to specify the date "2025 March 3rd at 15:30:00h", then storing it as [2025,3,3,15,30,0] will still mean exactly that date and time later, which is what the user intended.
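To illustrate, here is a minimal sketch of such a calendar-component value object in Python; the class name `CalendarDateTime` and its fields are my own illustration, not an existing library:

```python
from dataclasses import dataclass

# A calendar-component value: the stored fields mean exactly what the
# user wrote, independent of any future leap-second decisions.
@dataclass(frozen=True)
class CalendarDateTime:
    year: int
    month: int
    day: int
    hour: int
    minute: int
    second: int

appointment = CalendarDateTime(2025, 3, 3, 15, 30, 0)
# Decades later this still denotes "2025 March 3rd at 15:30:00h";
# no table lookup is needed to read it back.
print(appointment.hour, appointment.minute)  # 15 30
```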

Whereas, if we go with the epoch approach, we have to encode that as a count of seconds from an epoch, but we can't know how many leap seconds will have intervened, since those are decided semi-arbitrarily; by the time 2024 rolls around, our epoch-using code, still holding that same count of seconds, may find it suddenly denotes, say, 15:29:55h or something.
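The drift can be sketched concretely. In the toy code below, the epoch, the leap-second counts, and both helper functions are hypothetical illustrations, assuming an encoding where leap seconds are folded into the stored count and subtracted out on display:

```python
import datetime

# Hypothetical epoch for the count-of-seconds encoding.
EPOCH = datetime.datetime(2000, 1, 1)
target = datetime.datetime(2025, 3, 3, 15, 30, 0)

def stored_count(leap_seconds_known):
    # Encode the intended instant, folding in the leap seconds
    # known at storage time.
    return int((target - EPOCH).total_seconds()) + leap_seconds_known

def recovered_wall_clock(count, leap_seconds_known):
    # Decode back to a wall-clock time using the leap seconds
    # known at display time.
    return EPOCH + datetime.timedelta(seconds=count - leap_seconds_known)

# Stored when no further leap seconds were known:
count = stored_count(0)
# Later, 5 more leap seconds have been announced; the same stored
# count now decodes to a different wall-clock time:
print(recovered_wall_clock(count, 5))  # 2025-03-03 15:29:55
```

The stored value never changed, yet what it *means* did, which is exactly the hazard for future dates.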

It comes down to what is more important: preserving positions in time as the user intended (DWIM), or preserving a particular delta of time from now.

I say leave the complexity to when you explicitly do date-add or date-diff etc and keep the representation simple.
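As a sketch of where that complexity would live, here is a toy date-add in Python; the function `add_months` and its clamping rule are my own illustration of the kind of decision such an operation has to make, not any particular library's API:

```python
import calendar
import datetime

def add_months(d, n):
    # The stored value stays a simple (year, month, day); the awkward
    # cases, like "one month after January 31st", are decided here,
    # by clamping to the last day of a shorter target month.
    month_index = d.month - 1 + n
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return d.replace(year=year, month=month, day=day)

print(add_months(datetime.date(2024, 1, 31), 1))  # 2024-02-29
```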

Users expect that there may be subtle variations or round-offs when you calculate something new (e.g., rounding off 2/3 expressed in decimal or binary notation), but they expect that a conceptual value they explicitly set doesn't change.

Of course, I'm thinking from a database language point of view, being concerned with value consistency over time, but general purpose languages should typically have a lot of the same priorities and sensibilities.

-- Darren Duncan
