It seems we have some interested admins on this topic, so I have more
variables to consider.
I have fps_max set to 999 in my server.cfg and I run srcdsbooster,
resulting in the server console displaying 500 FPS +/- 25.
Using srcdsbooster provides the performance needed to keep the
players in the server without setting the tickrate.
Setting the tickrate higher meant I could no longer run 10 small
servers (16 slots or fewer).
(Note that I am not claiming a capability to serve 160 players on a
dual Xeon; about 100 is the max, IMnsHO.)

I recall Alfred saying something about the console FPS value actually
representing the simulations per second the server is running.
If so, then it seems logical that the higher the server FPS, the more
players can be served a good, solid, accurate update rate.

On the client side, I have three workstations I use to test server
responsiveness: a ThinkPad notebook I use for worst-case-scenario
analysis (suxors bad), a Dell P4 2.4 GHz w/Radeon 9800 Pro for mid-range
testing, and a custom-built box using dual-SLI technology with two
NVIDIA 6800 cards for top-end testing.

The difference between the low end and high end is obvious. The low-end
ThinkPad is still playable on my servers, even with a lousy 40 to 50 FPS.
The dual-SLI box, on the other hand, gets superior FPS due to the video
setup, but its update rates are also much higher. So it seems the
processing power on the client has a lot to do with this topic.

So what are the optimal settings to get maximum performance from both
the server and the client?
I have studied the data rates and found that very high data rates serve
no purpose, since the clients never request that much data.
I have found that sv_maxrate serves well at 8192. The effect is a more
stable server, and the data transfer is more predictable. The servers
keep their players, and when I ask how the performance is, I never hear
any negative lag remarks.
I have sv_maxupdaterate set to 100 and sv_minupdaterate set to 45. The
minupdaterate seems to be the factor that makes the biggest difference,
since it appears to make the clients update more frequently.
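For reference, here is roughly how the settings I described would look in a server.cfg. This is just a sketch of my setup, and the values are what works on my boxes, not a universal recommendation:

```cfg
// server.cfg (SRCDS console variables)
// Note: I do NOT pass -tickrate on the srcds command line; the tickrate
// stays at its default, and srcdsbooster handles the FPS boost instead.

fps_max 999            // server frame cap; with srcdsbooster the console shows ~500 FPS +/- 25
sv_maxrate 8192        // per-client bandwidth ceiling; keeps data transfer predictable
sv_maxupdaterate 100   // ceiling on world updates sent to each client per second
sv_minupdaterate 45    // floor on updates per second; the setting that seems to matter most
```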

All constructive dialog and discussion is most welcome on this topic.

Many thanks....
Steve



Whisper wrote:

--
Hi guys
What is the most updates a second you can get a client to receive?
Even with the following server and client settings, we can only seem to
manage 66-75 updates a second according to net_graph 3.
*Server*
sv_maxrate 20000
sv_maxupdaterate 150
tickrate 100
fps_max 300
18 players
*Client*
rate 20000
cl_rate 15000
cl_cmdrate 101
cl_updaterate 101
v-sync disabled
The server is only 8 hops and 40 kilometers physically away from me (the
actual wire distance would not be much further), with a consistent >20 ms
ping.
I am trying to work out how to get our servers to the magic 100 updates a
second, so we can serve out data at a rate that matches the normal 100 FPS
limit enforced on the client side. Unfortunately, I cannot get above 75
updates a second on the client side.
What else do I have to change?
Is it even possible?
If it is possible what settings are you using?
Does tickrate on one SRCDS process affect the tickrate on another SRCDS
process on the same box like the old HLDS sys_ticrate does?
Thanks
--

_______________________________________________
To unsubscribe, edit your list preferences, or view the list archives, please 
visit:
http://list.valvesoftware.com/mailman/listinfo/hlds






