The best way I can analogize it is this:

The quality of a 128 kbps MP3 file vs. CD-quality resolution.

Think of the bullets as the 'sound' of the file: less estimation [or 
compression], and more precision [of aim]. My theory is that the server 
doesn't have to predict as much in one tick. There would be less accumulation 
of input from the clients to process in each time span, leading to smoother 
operation and a 'tight' gameplay feel.
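To put some rough numbers on that idea, here's a small sketch (the command 
rate and frame rates are hypothetical illustration values, not anything 
measured from a real HLDS box). It just shows that a server running more 
frames per second processes a smaller batch of queued client commands per 
frame, so each simulation step covers a finer slice of game time:

```python
def inputs_per_frame(server_fps, client_cmd_rate=100):
    """Average number of client commands queued during one server frame.

    client_cmd_rate is an assumed commands-per-second figure for
    illustration only.
    """
    frame_time = 1.0 / server_fps        # seconds covered by one server frame
    return client_cmd_rate * frame_time  # commands accumulated in that window

# A 60 fps server chews through larger input batches per frame than a
# 300 fps server, which is the 'coarser resolution' in the analogy above.
low = inputs_per_frame(60)     # ~1.67 commands per frame
high = inputs_per_frame(300)   # ~0.33 commands per frame
print(f"60 fps server:  {low:.2f} cmds/frame")
print(f"300 fps server: {high:.2f} cmds/frame")
```

Fewer commands piling up between frames means less interpolation/prediction 
per step, which is all the analogy is claiming.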

-------------- Original message --------------

> Now I'm more confused. I've rented a server box for 4 years now. It's a
> PowerEdge P4 2.80 GHz with 1 GB of RAM and Windows 2003 Server edition, in a
> data center in Dallas, Texas, operating on a network that is fully meshed and
> redundant with 10 backbone providers. I run an hl2dm, a dods and an hl1 Natural
> Selection mod, 3 different servers off the box. All the servers run great.
> All 3 of my console windows show 60 to 70 fps. Clients, however, depending
> on their machine, can get well over 100 fps. As the server runs only console
> windows, I don't see how fps would affect the client's or the server's
> performance.
> FPS, Frames Per Second, being 60 or 300, how would this affect the server?
> It runs no graphical representation of the game, just console windows. I'm
> not understanding where the fps of a server could affect its performance.
> Please explain this to me, as the more I know, the better servers I can run.
> --
>
> _______________________________________________
> To unsubscribe, edit your list preferences, or view the list archives, please
> visit:
> http://list.valvesoftware.com/mailman/listinfo/hlds