On Fri, 21 Apr 2006 15:48:02 -0400, John S. Giltner, Jr. 
<[EMAIL PROTECTED]> wrote:

>We can't tell you.  This is dependent on the system that they are using.
>...

It also depends on how the time is set on the system at hand.
If the time is manually set then all bets are off.  The most accurate
clock in the world can be set wrong.  The values are meaningless except
for the intervals between displayed times.

If the clock synchronizes with a standard time source (and takes into
account the propagation delay of the standard signal) then the time will
be very accurate right after synchronization but will drift according to
the accuracy of the device's local clock.  Intervals will be as accurate
as the local clock ... except across a resynchronization event.
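To make the drift/resynchronization point concrete, here is a rough sketch (mine, not from the original discussion) of a toy clock model: the clock reads exactly right at its last sync, then runs fast or slow at a fixed rate in parts per million. The function name, drift rate, and sync times are all hypothetical illustration values.

```python
# Toy model (illustrative only): a local clock set exactly at `last_sync`
# that then drifts at a constant rate of `drift_ppm` parts per million.

def local_clock(true_time, last_sync, drift_ppm):
    """Reading of a local clock synced exactly at `last_sync`."""
    return last_sync + (true_time - last_sync) * (1 + drift_ppm / 1e6)

# A clock running 50 ppm fast, synced at t=0, is ~4.32 s ahead after
# one day (86,400 s) of true time.
error_after_a_day = local_clock(86_400, 0, 50) - 86_400

# Intervals measured between two readings are accurate to the same
# drift rate: a true 1000 s interval reads as ~1000.05 s.
t1 = local_clock(1_000, 0, 50)
t2 = local_clock(2_000, 0, 50)
interval = t2 - t1

# But an interval measured *across* a resynchronization event also
# absorbs the step correction applied at sync time, so it no longer
# reflects elapsed time.  Here the clock is resynced at t=86,400.5:
before_sync = local_clock(86_400, 0, 50)
after_sync = local_clock(86_401, 86_400.5, 50)
step_error = (after_sync - before_sync) - 1  # includes the ~-4.32 s step
```

The last few lines show the post's caveat: the one-second true interval across the resync appears to be roughly 3.3 seconds *negative* on this clock, because the accumulated drift was stepped out at synchronization.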

Pat O'Keefe 

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [EMAIL PROTECTED] with the message: GET IBM-MAIN INFO
Search the archives at http://bama.ua.edu/archives/ibm-main.html