A while back on the cod mailing list it was noticed that COD2 uses a 'weird' way of getting its processor cycles, taking them from in between the ticks of your OS. I am not a programmer (at least not to the extent of understanding something like this fully), so I do not comprehend how or why this is done. However, I do know that the values you see for your COD2 server in tools like 'top' are not the _real_ amount of resources your server is using. I have witnessed this behavior first hand in the past. So not only is this a reason to say the comparison is not really valid, it is also something to consider when running (a) COD2 server(s).
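To illustrate why this matters (a simplified sketch, not the real kernel accounting): classic Unix CPU accounting samples "who is on the CPU?" only at timer ticks, so a process that does its work between ticks and is idle at each tick boundary gets charged far less CPU than it really uses. The tick rate and timings below are assumptions for the demonstration.

```python
# Toy simulation of tick-based CPU accounting (illustrative only, NOT
# how COD2 or the kernel is actually implemented). The "sampler" looks
# at the process only at timer ticks (assumed 100 Hz here, i.e. 10 ms).

TICK_MS = 10          # assumed timer tick interval (100 Hz)
TIMELINE_MS = 1000    # simulate one second of wall-clock time

def charged_ticks(busy):
    """Count timer ticks at which the process is caught running.

    `busy(t)` returns True if the process is on the CPU at time t (ms).
    This is what a tick-sampling accountant would bill the process for.
    """
    return sum(1 for t in range(0, TIMELINE_MS, TICK_MS) if busy(t))

# Process A: runs for 8 ms in the middle of every tick interval but is
# idle exactly at the tick boundaries -> 80% real load, 0 ticks charged.
between_ticks = lambda t: 1 <= (t % TICK_MS) <= 8

# Process B: runs continuously -> caught at every single tick.
continuous = lambda t: True

print(charged_ticks(between_ticks))  # 0: 'top' would show ~0% CPU
print(charged_ticks(continuous))     # 100: billed for the full second
```

Both processes could be doing comparable real work, yet the sampler reports one as nearly idle, which is why numbers from 'top' make this kind of cross-game comparison unreliable.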
As for the srcds server using up a lot of resources: I'd say it's the same as for the client side of games, isn't it? In a year, today's most top-notch system will be outdated as well. That's just the way it works. With things like HDR, more entities, and other things for the server to compute being added, the server simply needs more 'power'. Don't get me wrong, I don't like this fact either, but it is just the way of the world, I suppose. --- Regime Evaldas Zilinskas wrote:
Try running a DoD:S server, where maps have more corners, houses, HDR, etc. A 24-slot dod_donner at 66 tick can't even run on a 3 GHz Pentium :) (something like CSS with de_inferno). FPS are below 30. The code needs to be optimised! Look at COD2: a 24-slot COD2 (DM) server uses ~20-30% of the same 3 GHz Pentium.
_______________________________________________ To unsubscribe, edit your list preferences, or view the list archives, please visit: http://list.valvesoftware.com/mailman/listinfo/hlds_linux

