Dennis Sweeney <sweeney.dennis...@gmail.com> added the comment:

I reproduced something similar on Python 3.9.0b1, Windows 64-bit version:

    py -m pyperf timeit -s "import threading; E = threading.Event()" "E.wait(<NUMBER>)"

NUMBER            Mean +- std dev
-------------------------------------------
0.0               5.79 us +- 0.13 us
0.000000000001    15.6 ms +- 0.1 ms
0.001             15.6 ms +- 0.1 ms
0.01              15.6 ms +- 0.6 ms
0.013             15.5 ms +- 0.6 ms
0.015             15.9 ms +- 0.9 ms
0.016             25.2 ms +- 0.5 ms
0.017             31.2 ms +- 0.2 ms
0.018             31.2 ms +- 0.4 ms
0.025             31.2 ms +- 1.0 ms
0.05              62.2 ms +- 0.8 ms
0.1               109 ms +- 2 ms
0.2               201 ms +- 3 ms
0.5               500 ms +- 0 ms
1.0               1.00 sec +- 0.00 sec

At smaller timeouts, the wait appears quantized to multiples of ~15.6 ms (the 
default Windows timer tick?), but it becomes proportionally more accurate as the 
timeouts grow. I don't think it's a measurement error, since the first 
measurement (timeout 0.0) resolves down to microseconds.

Perhaps this is just an OS-level thread-scheduling issue?
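For reference, the same effect can be observed without pyperf by timing Event.wait() directly with time.perf_counter(). This is a minimal sketch (measure_wait and its parameters are my own names, not anything from the stdlib); on Windows the actual elapsed times should snap to ~15.6 ms steps, while on other platforms they may track the requested timeout more closely:

```python
import threading
import time

def measure_wait(timeout, repeats=5):
    """Return the minimum wall-clock time actually spent in Event.wait(timeout)."""
    e = threading.Event()  # never set, so wait() always runs to its timeout
    samples = []
    for _ in range(repeats):
        start = time.perf_counter()
        e.wait(timeout)
        samples.append(time.perf_counter() - start)
    return min(samples)

for t in (0.001, 0.005, 0.016, 0.02):
    print(f"requested {t * 1000:6.1f} ms -> actual {measure_wait(t) * 1000:6.1f} ms")
```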

----------
nosy: +Dennis Sweeney

_______________________________________
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue41299>
_______________________________________