scott brown wrote:
--
I'm confused. Why do we care what FPS the server gets?

Because ticks happen regardless of frames.

CHANGES can only be processed during a frame, but THE WORLD changes
ONCE per tick. So, if you have more frames than ticks, many changes can
be processed within a single tick. If you have fewer frames than ticks,
some changes will never be processed (read: bad bullet registration).
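The frames-versus-ticks relationship above can be sketched numerically. This is a hypothetical toy model, not Source engine code: it just counts how many input-processing frames land inside each world tick, assuming evenly spaced frames and ticks.

```python
# Toy model (assumption, not Source code): frames process incoming
# changes, ticks advance the world. Count frames falling in each tick.

def simulate(frame_rate, tick_rate, duration=1.0):
    """Return, per tick, how many frames (input opportunities) it saw."""
    frame_times = [i / frame_rate for i in range(int(duration * frame_rate))]
    ticks = int(duration * tick_rate)
    per_tick = [0] * ticks
    for t in frame_times:
        tick_index = min(int(t * tick_rate), ticks - 1)
        per_tick[tick_index] += 1
    return per_tick

# More frames than ticks: every tick sees at least one frame.
print(min(simulate(frame_rate=300, tick_rate=100)))  # 3

# Fewer frames than ticks: some ticks see no frame at all, so changes
# arriving in those ticks are processed late or not at all.
print(min(simulate(frame_rate=60, tick_rate=100)))   # 0
```

With 60 fps against 100 ticks/s, some ticks get zero frames, which is exactly the "changes will never be processed" case described above.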

The above is critical: if it is not adhered to, your server is
mathematically guaranteed to be faulty.

What's equally important, though, is client latency. You only receive a
world update on a tick, i.e. the server sends clients a new picture of
the world every tick. Clients also send data on ticks, but not on the
same timeline. Clients are always behind, ALWAYS. The more frames your
server processes, the faster it can process client responses, which
actually cuts down on latency significantly. As any Source player will
know, it's easy to die in sub-second blocks; if the server waits
another 90ms or so to process your incoming data because that's how long
it is until the next frame, it's quite upsetting (and you can see it on
the client).
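The 90ms figure above is just the frame interval. A back-of-envelope sketch (an assumption for illustration, not a measurement): in the worst case, an incoming packet waits one full frame interval before the server touches it, so the added latency shrinks as server fps rises.

```python
# Hypothetical worst-case model: a client packet arriving just after a
# frame starts must wait one full frame interval to be processed.

def worst_case_wait_ms(server_fps):
    """Worst-case added input latency in milliseconds at a given fps."""
    return 1000.0 / server_fps

print(worst_case_wait_ms(10))   # 100.0 ms - roughly the painful case
print(worst_case_wait_ms(60))   # ~16.7 ms
print(worst_case_wait_ms(300))  # ~3.3 ms
```

Under this model, pushing a server from 60 to 300 fps trims the worst-case input wait from roughly 17ms to about 3ms per packet, which is why higher server fps can feel noticeably more responsive even though the tick rate is unchanged.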

The max fps for a
server doesn't affect the clients' max fps. Why should I care if my server
console pulls 60 or 300 fps? Wouldn't forcing my console window to 300 fps
waste server resources that could be better used by my clients? I'm not
flaming, I really want to know.
 I've been running H-L servers for 5 years and always pull 60-70 fps on any
server config on our rental box.
--

_______________________________________________
To unsubscribe, edit your list preferences, or view the list archives, please 
visit:
http://list.valvesoftware.com/mailman/listinfo/hlds


