On 2005-12-07, Fredrik Lundh <[EMAIL PROTECTED]> wrote:
> Peter Hansen wrote:
>
>> Going by memory, Linux will generally be 1ms resolution (I might be
>> off by 10 there...), while Windows XP has about 64 ticks per second,
>> so .015625 resolution...
>
> here's a silly little script that measures the difference between
> two distinct return values, and reports the maximum frequency
> it has seen this far:
>
>     import time
>
>     def test(func):
>         mm = 0
>         t0 = func()
>         while 1:
>             t1 = func()
>             if t0 != t1:
>                 m = max(1 / (t1 - t0), mm)
>                 if m != mm:
>                     print m
>                     mm = m
>                 t0 = t1
>
>     test(time.time)
>     # test(time.clock)
>
> if I run this on the Windows 2K box I'm sitting at right now, it settles
> at 100 for time.time, and 1789772 for time.clock.  on linux, I get 100
> for time.clock instead, and 262144 for time.time.
At least under Linux, I suspect you're just measuring loop time rather than the granularity of the time measurement. I don't know which library call the time module uses, but if it's gettimeofday(), that is limited to 1us resolution. clock_gettime() provides an API with 1ns resolution. Not sure what the actual data resolution is...

-- 
Grant Edwards
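For what it's worth, a modern Python (3.3+) can answer both questions at once: time.get_clock_info() reports the underlying implementation and its advertised resolution, while a short loop measures the smallest step the clock actually delivers. A quick sketch — the helper name min_tick is mine, and the measured step can be coarser than the advertised one:

```python
import time

def min_tick(clock, samples=100000):
    """Empirically find the smallest nonzero step a clock reports.

    Returns float('inf') if the clock never changed across the samples
    (i.e. its granularity is coarser than the whole loop).
    """
    best = float("inf")
    prev = clock()
    for _ in range(samples):
        now = clock()
        if now != prev:
            best = min(best, now - prev)
            prev = now
    return best

# Advertised resolution, as reported by the interpreter (Python 3.3+):
for name in ("time", "monotonic", "perf_counter"):
    info = time.get_clock_info(name)
    print("%-12s impl=%s resolution=%g" % (name, info.implementation, info.resolution))

# Measured smallest observed step -- this includes loop overhead,
# so it is an upper bound on the true granularity:
print("measured time.time() step:  %g" % min_tick(time.time))
print("measured perf_counter step: %g" % min_tick(time.perf_counter))
```

On Linux this makes the gettimeofday()-vs-clock_gettime() question moot: get_clock_info shows which call the interpreter picked, and the measured step shows whether you are seeing clock granularity or just loop time.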