On 1 November 2016 at 13:44, Giliad Wilf
<000000d50942efa9-dmarc-requ...@listserv.ua.edu> wrote:
> On Tue, 1 Nov 2016 13:03:27 -0400, Tony Harminc <t...@harminc.net> wrote:

>>I'm a little confused about what kind of units "1.4 milliseconds a day
>>per century" would be in.
>>
>>Tony H.
>>
> This means that every 100 years the day gets about 1.4 thousandths of a
> second longer, compared to the length of the day as measured when atomic
> clocks first became commercially available, in 1957; since then, 36 leap
> seconds have been counted.

Sure - I understand what's going on. It's just that, typically, one
can "see" the nature of the units involved in such a statement. Often
enough, when some politician or news reporter makes a statement like
"Ontario exported 2.5 GW of electricity last year", or "an electric
kettle uses about 1.5 kWh", the meaninglessness jumps right out,
because the units make no sense in context: gigawatts measure power,
not an amount of energy exported over a year, and kilowatt-hours
measure energy, not the rate at which a kettle draws it.

And in the familiar case where the base quantity is a length (say, m)
and each successive division is by a time unit (say, s), we have a
name for every level: distance (m), speed (m/s), acceleration (m/s^2),
jerk (m/s^3).

In this case we have time/time/time (milliseconds per day per
century), which just fails to jump out at me. Maybe my imagination,
visual or otherwise, is lacking.
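
Just to convince myself the arithmetic holds together, here's a rough
sketch (Python, purely illustrative; the 1.4 ms/day/century figure is
the one quoted above, and everything else - the Julian century, the
60-year span - is my own assumption). It treats the quoted number as a
"day-length acceleration": the per-day excess integrates, so the
accumulated offset grows quadratically, just like distance under
constant acceleration:

    # Units of the quoted figure: seconds of day length, per day, per century.
    DAYS_PER_CENTURY = 36525.0                   # Julian century
    RATE = 1.4e-3 / DAYS_PER_CENTURY             # s of day length, per day elapsed

    def excess_day_length(days_elapsed):
        """How much longer (seconds) a day is than the reference day."""
        return RATE * days_elapsed

    def accumulated_offset(days_elapsed):
        """Accumulated clock offset (seconds): the daily excess integrates,
        so it grows as 1/2 * a * t**2, like distance under constant
        acceleration."""
        return 0.5 * RATE * days_elapsed ** 2

    days = 60 * 365.25                           # roughly 1957 to now
    print(excess_day_length(days) * 1000)        # ~0.84 ms longer per day
    print(accumulated_offset(days))              # ~9 s from the drift alone

Note this isolates only the drift term; the actual leap-second count
also depends on how far the day already was from 86400 SI seconds at
the starting point, which the 1.4 figure alone doesn't tell you.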

> The epoch used in computer systems is 1972, with 26 leap seconds
> counted since then.

Not any computer systems I work with. They use either 1900 or 1970 as
their epoch. What uses 1972?
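
For concreteness, here is the relationship between the two epochs I do
see every day, as a minimal sketch (Python; the conversion helper and
its name are mine, and leap seconds are deliberately ignored, as both
timescales conventionally do):

    from datetime import datetime, timezone

    EPOCH_1900 = datetime(1900, 1, 1, tzinfo=timezone.utc)   # TOD-clock style
    EPOCH_1970 = datetime(1970, 1, 1, tzinfo=timezone.utc)   # Unix

    # 70 years including 17 leap days = 2,208,988,800 seconds.
    OFFSET_1900_TO_1970 = int((EPOCH_1970 - EPOCH_1900).total_seconds())

    def seconds_since_1900_to_unix(secs_1900):
        """Convert seconds since 1900-01-01 UTC to a Unix timestamp
        (seconds since 1970-01-01 UTC), ignoring leap seconds."""
        return secs_1900 - OFFSET_1900_TO_1970

    print(OFFSET_1900_TO_1970)    # 2208988800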

Tony H.

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
