L33TGaming wrote
> Do you understand var? Var is not the deviation from the server
> tickrate; rather, it is the deviation across the most recent
> frametimes. If your server tickrate drops to a sustained 64 from 128,
> your var isn't going to be 64.
> 
> When you get 128 FPS server- or client-side, you're not getting a
> uniform 8 ms per update; you're getting 128 frames over the second as
> a whole. Some ticks arrive at different times, and this jitter between
> frames is what var measures. See the definition below.
> 
> Definitions:
> The "sv" tag shows the fps of the server as of the latest networking
> update delivered to the client.
> The "var" shows the standard deviation of the server's frametime
> (where server fps = 1.0 / frametime) over the last 50 frames recorded
> by the server. If the server's framerate is below 20 fps, then this
> line will draw in yellow. If the server's framerate is below 10 fps,
> then this line will draw in red.

Quoted from: 
http://csgo-servers.1073505.n5.nabble.com/Server-performance-problems-tp6375p6769.html
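
To make the quoted definition concrete, here is a minimal sketch in
Python (hypothetical names, not actual Source engine code) of a
var-style metric: the standard deviation of the last 50 frametimes. It
also shows why a sustained tickrate drop barely moves var while
frame-to-frame jitter does:

from collections import deque
import statistics

WINDOW = 50  # the definition above uses the last 50 frames

class VarMeter:
    """Rolling standard deviation of server frametime, in ms."""
    def __init__(self):
        self.frametimes_ms = deque(maxlen=WINDOW)

    def record_frame(self, frametime_s):
        # Record one server frame's duration (seconds in, ms stored).
        self.frametimes_ms.append(frametime_s * 1000.0)

    def var(self):
        # Population standard deviation over the window; 0 until we
        # have at least two samples.
        if len(self.frametimes_ms) < 2:
            return 0.0
        return statistics.pstdev(self.frametimes_ms)

# A server alternating 6 ms and 10 ms frames still averages 125 fps,
# but var comes out as 2.0: jitter, not sustained tickrate, is what
# var measures. A perfectly steady 64-tick server would show var ~0.
m = VarMeter()
for i in range(50):
    m.record_frame(0.006 if i % 2 == 0 else 0.010)
print(round(m.var(), 2))  # -> 2.0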

His point is that you don't notice a difference between a var of 0.8
and a var of 1.8; perceiving that is impossible.

I have already explained why you get fewer ticks than I do: as I said
before, you need a better CPU at higher clocks. Ejziponken has also
pointed this out to you guys twice already. It seems people are pretty
mad here...



