Re: Precision vs. resolution
John Cowan wrote on 2006-05-24 14:25 UTC:
> Can someone lay out for me exactly what the difference is between
> clock precision and clock resolution?  I've read the NTP FAQ and
> several other pages but am more confused than ever.

I do not think there exists a single universally recognized formal
international standard that defines the exact meaning of these terms.
I have seen many slightly contradicting definitions of these terms
used by different authors. Therefore, whenever you use these terms, it
is best to attach a brief definition yourself to clarify what exactly
you mean.

Markus

--
Markus Kuhn, Computer Laboratory, University of Cambridge
http://www.cl.cam.ac.uk/~mgk25/ || CB3 0FD, Great Britain
Re: Precision vs. resolution
> I should perhaps explain that I was interested in an internal
> representation for durations, which I am now representing as a triple of
> months, minutes, and seconds (the number of minutes in a month is not
> predictable, nor the number of seconds in a minute given leap seconds,
> but all other relationships are predictable: 10080 minutes/week, 12
> months/year, 100 years/century, etc.)  To this I would add a fourth

I'm curious what motivated this particular representation, given that
one could perhaps also use years/days/seconds, or weeks/hours/seconds,
or months/hours/seconds, or, like GPS, weeks/seconds, or, like MJD,
days/fractions...

> nonnegative integer representing "clock resolution units" and wanted to
> make sure I had the terminology correct.

Take this with a grain of salt, since I'm still confused by resolution
vs. granularity myself, but isn't the phrase "clock resolution units"
redundant? If resolution is about the minimum unit of measure, it
would seem the phrase "clock resolution" is sufficient, no?

Further, if you are counting clock cycles to subdivide integer
seconds, then consider words like clock rate, clock period, clock
granularity, clock cycles, or just plain clock ticks. That gives you
"months/minutes/seconds/ticks", which, to me at least, sounds better
than "months/minutes/seconds/clock resolution units". If your clock
resolution is ms or us or ns, then it's even simpler, e.g.,
"months/minutes/seconds/ns". This representation would accommodate,
for example, a typical 1.193 MHz clock, since the resolution is 1 ns
while the granularity is 838 ns.

/tvb
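For what it's worth, the arithmetic above can be sketched in a few
lines of Python. This is my own illustration of the quoted scheme, not
code from either poster; the variable names and the sample duration
tuple are invented for the example:

```python
# A sketch of the quoted duration scheme: months, minutes, seconds,
# plus a fourth field of sub-second "ticks" at some clock rate.
CLOCK_HZ = 1_193_000               # the typical 1.193 MHz PC timer mentioned above

# Granularity: the smallest time step such a clock can actually register.
granularity_ns = 1e9 / CLOCK_HZ
print(round(granularity_ns))       # prints 838, matching the ~838 ns figure

# One duration under this scheme: 2 months, 30 minutes, 5 seconds, 12 ticks.
duration = (2, 30, 5, 12)

# The fixed relationships the quoted text lists hold regardless of
# leap seconds or calendar irregularities:
assert 7 * 24 * 60 == 10080        # minutes per week
assert 12 * 100 == 1200            # months per century
```

Only the sub-second field depends on the clock hardware; the first
three fields are pure calendar/SI bookkeeping.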
Re: Precision vs. resolution
Rob Seaman scripsit:
> Interesting question.  Perhaps it is the distinction between
> addressability and physical pixels that one encounters on image
> displays and hardcopy devices?  (Still have to posit which is which
> in that case :-)

Thanks to those who responded either publicly or privately. In
summary, "infinite are the arguments of mages". Some take resolution
to be a near-synonym for precision, some take it to be a synonym for
granularity. The more definitive the source, the vaguer the
definitions.

I should perhaps explain that I was interested in an internal
representation for durations, which I am now representing as a triple
of months, minutes, and seconds (the number of minutes in a month is
not predictable, nor the number of seconds in a minute given leap
seconds, but all other relationships are predictable: 10080
minutes/week, 12 months/year, 100 years/century, etc.). To this I
would add a fourth nonnegative integer representing "clock resolution
units" and wanted to make sure I had the terminology correct.

Ah well.

--
John Cowan      [EMAIL PROTECTED]      http://ccil.org/~cowan
In the sciences, we are now uniquely privileged to sit side by side
with the giants on whose shoulders we stand.  --Gerald Holton
Re: Precision vs. resolution
> Can someone lay out for me exactly what the difference is between
> clock precision and clock resolution?  I've read the NTP FAQ and
> several other pages but am more confused than ever.
>
> (I do understand the distinction between precision and accuracy:
> 3.1429493 is \pi precise to 8 significant digits, but accurate
> only to 3.)

Key words for clocks and clock measurement include: precision,
accuracy, stability, resolution, and granularity. Here's an informal
stab at definitions for you:

Accuracy is how close a clock is to an accepted standard.
Precision relates to how consistent repeated measurements are.
Stability refers to how well a clock keeps time over time.
Resolution is how finely one can measure.
Granularity is the minimum [digital] increment from one reading to the next.

/tvb
http://www.leapsecond.com/time-nuts.htm
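One informal way to see a few of these distinctions numerically is to
simulate a clock that has a fixed offset (poor accuracy), some jitter
(limited precision), and quantized readings (granularity), then
recover each property from its readings. This sketch is my own
illustration, with all constants invented for the example:

```python
import statistics

# Illustrative fake clock: reads true time plus a fixed offset,
# plus jitter, quantized to a granularity step. All numbers invented.
GRANULARITY = 0.001            # 1 ms: smallest increment between readings
OFFSET = 0.050                 # clock runs 50 ms fast: an accuracy error

def read_clock(true_time, jitter):
    raw = true_time + OFFSET + jitter
    return round(raw / GRANULARITY) * GRANULARITY   # quantize the reading

jitters = [-0.004, 0.002, -0.001, 0.003, 0.000]     # a few ms of noise
readings = [read_clock(100.0, j) for j in jitters]

# Accuracy: mean offset from the accepted value (recovers ~0.050 s).
accuracy_error = statistics.mean(readings) - 100.0
# Precision: spread of repeated measurements (a few ms here).
precision = statistics.stdev(r - 100.0 for r in readings)
print(accuracy_error, precision)
```

Note that if the jitter were much smaller than the granularity step,
every reading would quantize to the same value and the spread would
vanish, which is one reason granularity and precision are so easily
conflated.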
Re: Precision vs. resolution
On May 24, 2006, at 7:25 AM, John Cowan wrote:
> Can someone lay out for me exactly what the difference is between
> clock precision and clock resolution?

Interesting question. Perhaps it is the distinction between
addressability and physical pixels that one encounters on image
displays and hardcopy devices? (Still have to posit which is which in
that case :-)

You might have more luck directing this question to time-nuts
(http://www.febo.com/pipermail/time-nuts) or perhaps the NTP WG
([EMAIL PROTECTED]), although one would be delighted to find this list
capable of generating a knowledgeable response to any such clearly
expressed neutral question :-)

Rob
NOAO
Precision vs. resolution
Can someone lay out for me exactly what the difference is between
clock precision and clock resolution? I've read the NTP FAQ and
several other pages but am more confused than ever.

(I do understand the distinction between precision and accuracy:
3.1429493 is \pi precise to 8 significant digits, but accurate only
to 3.)

Thanks.

--
Values of beeta will give rise to dom!        John Cowan
(5th/6th edition 'mv' said this if you        http://www.ccil.org/~cowan
tried to rename '.' or '..' entries; see      [EMAIL PROTECTED]
http://cm.bell-labs.com/cm/cs/who/dmr/odd.html)
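As an aside on the parenthetical above, the digit count checks out.
Here is a throwaway Python sketch (my own, with an invented helper
name, and not a rigorous definition of accuracy) that counts how many
leading significant digits of 3.1429493 agree with pi:

```python
import math

# Count leading significant digits on which two positive values agree.
# Illustrative helper only; ignores signs and exponent alignment.
def matching_sig_digits(x, y):
    xs = f"{x:.10g}".replace(".", "")
    ys = f"{y:.10g}".replace(".", "")
    n = 0
    for a, b in zip(xs, ys):
        if a != b:
            break
        n += 1
    return n

print(matching_sig_digits(3.1429493, math.pi))  # prints 3: agrees only in "3.14"
```

So the value carries 8 significant digits of precision, but only the
first 3 of them are accurate.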