>> Can anybody answer these questions directly? Bonus points if you can do it
>> without being condescending.
>> 1. What does the value represented by sv mean now?

The "sv" value shows how many milliseconds the server simulation step took on the last networked frame.

>> 2. What does the +- next to the sv represent?

The value following "sv +-" shows the standard deviation of the server simulation step duration, measured in milliseconds over the history of the last 50 server frames.

>> 3. What does the current value for var represent?

When server performance is meeting tickrate requirements, the server "var" value represents the standard deviation of the accuracy of the server OS nanosleep function, measured in microseconds over the history of the last 50 server frames. The latest update relies on it for efficiently sleeping and waking up to start the next frame simulation. It should usually be fractions of a millisecond.

The client "var" near the fps net graph display shows the standard deviation of the client framerate, measured in milliseconds over the history of the last 1000 client frames. By using fps_max to restrict client rendering and maintain a consistent fps, the client can keep framerate variability very low, but keep in mind that system processes and 3rd-party software can influence framerate variability as well.

>> 4. Originally, it was considered respectable to have a var of less than 1,
>> reasonable to have it spike as high as 2, but pretty much horrible to have a
>> variance remain above 2 for any length of time. What would be the equivalent
>> values for the three new measurements (sv, +-, and var)?

For a 64-tick server, as long as the sv value stays mostly below 15.625 ms, the server is meeting the 64-tick rate requirements correctly. For a 128-tick server, as long as the sv value stays mostly below 7.8 ms, the server is meeting the 128-tick rate requirements correctly.

If the standard deviation of frame start accuracy exceeds fractions of a millisecond, then the server OS has lower sleep accuracy, and you might want to keep the sv simulation duration within the max duration minus the OS sleep precision (e.g. for a 64-tick Windows server with a sleep accuracy variation of 1.5 ms, you might want to make sure that server simulation doesn't take longer than 15.625 - 1.5 ~= 14 ms to ensure the best experience).

Hope this helps,
-Vitaliy
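As a rough illustration of the arithmetic described above, here is a minimal C++ sketch (not Valve's code; the FrameStats struct, the sample durations, and the 1.5 ms sleep-jitter figure are hypothetical) that keeps the durations of the last 50 server frames, reports the latest duration and its standard deviation the way sv and +- are described, and compares them against the 64-tick budget of 1000/64 = 15.625 ms.

// Minimal sketch (hypothetical, not Valve's implementation) of how the
// sv and +- values described above can be derived: keep the durations
// of the last 50 server frames, report the latest duration plus the
// standard deviation over that window, and compare against the tick
// budget minus an assumed OS sleep-accuracy margin.
#include <cmath>
#include <cstdio>
#include <deque>

struct FrameStats {
    static constexpr size_t kWindow = 50;   // history length used for the +- value
    std::deque<double> frameMs;             // simulation duration per frame, in ms

    void AddFrame(double ms) {
        frameMs.push_back(ms);
        if (frameMs.size() > kWindow)
            frameMs.pop_front();
    }

    double Latest() const {                 // the "sv" value: last frame's duration, ms
        return frameMs.empty() ? 0.0 : frameMs.back();
    }

    double StdDev() const {                 // the "+-" value: std deviation over the window, ms
        if (frameMs.size() < 2) return 0.0;
        double mean = 0.0;
        for (double v : frameMs) mean += v;
        mean /= frameMs.size();
        double var = 0.0;
        for (double v : frameMs) var += (v - mean) * (v - mean);
        return std::sqrt(var / frameMs.size());
    }
};

int main() {
    const double tickrate      = 64.0;
    const double budgetMs      = 1000.0 / tickrate;  // 15.625 ms per tick at 64-tick
    const double sleepJitterMs = 1.5;                // assumed OS sleep accuracy variation (hypothetical)

    FrameStats stats;
    // Hypothetical sample of simulation durations in ms.
    for (double ms : {4.2, 5.1, 3.9, 6.0, 4.8})
        stats.AddFrame(ms);

    std::printf("sv %.1f +-%.1f ms (budget %.3f ms, safe target <= %.1f ms)\n",
                stats.Latest(), stats.StdDev(),
                budgetMs, budgetMs - sleepJitterMs);
}

The 50-frame window matches the server history length mentioned above; tracking the client var would use the same calculation over the last 1000 client frame times instead.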
