On 02.05.2011 13:02, Andres Valloud wrote:
Ok, so each time you add the offset, things become more and more imprecise...

http://msdn.microsoft.com/en-us/library/dd757629%28v=vs.85%29.aspx

"The default precision of the timeGetTime function can be five milliseconds or more, depending on the machine. You can use the timeBeginPeriod and timeEndPeriod functions to increase the precision of timeGetTime. If you do so, the minimum difference between successive values returned by timeGetTime can be as large as the minimum period value set using timeBeginPeriod and timeEndPeriod."

Note that you cannot even assume what the precision is, because timeBeginPeriod() changes the precision for EVERY program on the machine (plus it has other side effects). Of course, the docs then state

"Use the QueryPerformanceCounter and QueryPerformanceFrequency functions to measure short time intervals at a high resolution"

even though the relevant specs mention all sorts of problems when one uses QueryPerformanceCounter(). This is typical MSDN, unfortunately.
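
For completeness, the QueryPerformanceCounter/QueryPerformanceFrequency pattern the docs recommend looks roughly like this (again only a sketch, measuring a placeholder Sleep):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    LARGE_INTEGER freq, start, end;

    QueryPerformanceFrequency(&freq);      /* counts per second, fixed at boot */
    QueryPerformanceCounter(&start);
    Sleep(20);                             /* the interval being measured */
    QueryPerformanceCounter(&end);

    double ms = (double)(end.QuadPart - start.QuadPart) * 1000.0
                / (double)freq.QuadPart;
    printf("elapsed: %.3f ms\n", ms);
    return 0;
}
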
The way I read it, each _call_ to timeGetTime should be accurate to within 5 milliseconds.
The current time is calculated as baseTime + (newTimeGetTime - baseTimeGetTime), so the imprecision of individual calls does not add up and cannot explain drifts of 36 seconds...

Then, when timeGetTime wraps around, a new baseTime and baseTimeGetTime pair is obtained.
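
In code, my reading of that scheme is roughly the sketch below (not the actual VM source; the names and the use of GetSystemTimeAsFileTime for the base time are my assumptions):

#include <windows.h>

static DWORD     baseTimeGetTime;   /* timeGetTime() sample at the last re-base */
static ULONGLONG baseTimeMs;        /* wall-clock milliseconds at the last re-base */

static void rebase(void)
{
    /* Assumption: the base wall-clock time comes from GetSystemTimeAsFileTime;
       the actual VM may use a different source. */
    FILETIME ft;
    ULARGE_INTEGER u;
    GetSystemTimeAsFileTime(&ft);          /* 100 ns units since 1601 */
    u.LowPart  = ft.dwLowDateTime;
    u.HighPart = ft.dwHighDateTime;
    baseTimeMs      = u.QuadPart / 10000;  /* convert to milliseconds */
    baseTimeGetTime = timeGetTime();
}

static ULONGLONG currentTimeMs(void)
{
    DWORD now = timeGetTime();
    if (baseTimeMs == 0 || now < baseTimeGetTime) {
        rebase();                          /* first call, or timeGetTime wrapped */
        now = timeGetTime();
    }
    /* Each result is base + a single delta, so the imprecision of individual
       timeGetTime calls does not accumulate. */
    return baseTimeMs + (now - baseTimeGetTime);
}

The wrap check only works if the clock is queried at least once per ~49.7 days, the period of the 32-bit millisecond counter; timeGetTime again needs winmm.lib.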

Cheers,
Henry
