The TIME() function is supposed to return the number of seconds since midnight: 
whole seconds on Unix, but on Windows machines it returns a real number, 
implying it is accurate to the nearest .001 second.

Of course, it isn't really that accurate. I noticed on my server that it rather 
reliably returns values quantized to the nearest .015 or .016 seconds, giving 
about 64 divisions per second (1/64 s = .015625 s).
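
Something like this quick loop (a rough sketch in UniVerse BASIC; the variable 
names are my own) makes the quantization visible by printing the gap between 
successive distinct TIME() values:

   * Sample TIME() in a tight loop; each time the value changes, print
   * the gap. On my Windows server the gaps come out around .015-.016.
   LAST = TIME()
   SAMPLES = 0
   LOOP
   WHILE SAMPLES < 20 DO
      NOW = TIME()
      IF NOW # LAST THEN
         PRINT NOW - LAST
         LAST = NOW
         SAMPLES = SAMPLES + 1
      END
   REPEAT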

Back in the old days, the PC's timer chip had a periodicity of .054 seconds. 
Does anyone know how Universe manages to tick every 16 milliseconds? It seems 
somewhat dependent on machine load. With my head buried in programming, I 
suppose I might have missed common hardware improvements <g>.

My main concern is reliability: under heavy load, can you miss a tick? Under 
the old-fashioned hardware-driven interrupt model, you would never miss a tick 
no matter what the load. If it is reliable, it's too bad we can't write 
interrupt handlers in Universe. That would be cool.

Barry Brevik