blkraven, thank you for your response, I appreciate it; however, my
request for information was sarcastic, and in that sense rather
pointless to send to the list. I was egging the community on to provide
information that is not entirely accurate. See my previous mail on this
thread for more information.

blkraven wrote:
A multi-media timer, or mm-timer for short (do a Google search on the
abbreviation), handles the rate and precision at which data gets processed.

It's the NT kernel timer, not a multi-media timer. There is an API
wrapper in one of the multi-media DLLs. Everyone on the forums calls it
a multi-media timer for no reason other than that they know no better.
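
As an aside, you can observe the effective timer granularity from user
space without touching any Windows API. Here is a minimal Python sketch
(cross-platform; the function name and sample count are mine): it
requests a 1 ms sleep and measures how long the sleep actually takes.
On a kernel with a coarse default timer period the overshoot will be
large; with the timer resolution raised it shrinks.

```python
import time

def measure_sleep_granularity(samples=50):
    """Estimate the effective timer granularity by requesting a 1 ms
    sleep and measuring how long it actually takes (median of samples)."""
    actuals = []
    for _ in range(samples):
        start = time.perf_counter()
        time.sleep(0.001)          # request a 1 ms sleep
        actuals.append(time.perf_counter() - start)
    actuals.sort()
    # The median actual sleep length approximates the timer/scheduler step.
    return actuals[len(actuals) // 2]

if __name__ == "__main__":
    granularity = measure_sleep_granularity()
    print(f"median 1 ms sleep actually took {granularity * 1000:.2f} ms")
```

On a stock Windows box you would expect the measured value to be well
above 1 ms until something in the system raises the timer resolution.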

The normal timer isn't that precise because most programs don't need high
precision; programs that do need it, such as VoIP and streaming media,
change this mm-timer.

Many VoIP apps do not use the timer resolution to derive quantization and
sequencing timings, unlike many medium- to high-quality / video codecs,
sorry. If you'd said "Skype" or "DivX", then sure.

For gaming it would be preferred to use a high precision timer for the
bullet registry.

Oh, really!?!?  (Expression of shock. I was digging for bad info, but
this one takes the biscuit!)

Please, if you think you can justify this, feel free to try. You will
fail and make yourself look like a fool. Keep this kind of imagination
to the public sections of script-kiddie forums, please.

The primary cause of poor bullet registration on hlds is dropped frames
or dropped ticks. This occurs most commonly when the FPS drops below the
tickrate. At the default kernel timer resolution (on both Windows and
Linux) and the default tickrate, this should not happen.
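
To make the dropped-tick failure mode concrete, here is a deliberately
simplified Python sketch (the one-step-per-frame model and the tickrate
of 66 are illustrative assumptions on my part, not measurements): if the
server can run at most one simulation step per rendered frame, any frame
rate below the tickrate loses ticks.

```python
def simulate(tickrate, fps, duration=1.0):
    """Toy model: the server owes `tickrate` ticks per second but can
    only run one simulation step per frame.  Returns the number of
    ticks dropped over `duration` seconds."""
    ticks_due = int(tickrate * duration)
    frames_available = int(fps * duration)
    ticks_processed = min(ticks_due, frames_available)
    return ticks_due - ticks_processed

# With FPS comfortably above the tickrate, nothing is dropped...
print(simulate(tickrate=66, fps=100))   # -> 0
# ...but once FPS falls below the tickrate, ticks start to drop.
print(simulate(tickrate=66, fps=50))    # -> 16
```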

The fact that server-side processing latency drops massively as you
increase server-side FPS is something that many of the people now
contributing to this discussion argued with me about fairly recently
(for some stupid reason); however, I have verified my statements. This
may result in what you perceive as "better bullet registration" due to a
decrease in processing latency, and it is not unlikely that timing
accuracy will increase in general, in both client rendering and server
rendering. This is not a correction of model placement or bullet
trajectory, but a sync between the client and server platforms. Due to
the nature of the netcode design, there are few other ways to get, as
you say, "poor bullet registration". For more information on this (in
particular, pay attention to the fact that cmd packets are timestamped,
that the system operates on ticks only (discrete time), and so on and
so forth) check here:
http://developer.valvesoftware.com/wiki/Source_Multiplayer_Networking
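
As a rough sketch of what "discrete time" means here (the function and
the tickrate constant are my own illustration, not Valve's code):
command packets carry timestamps, but the server only acts on tick
boundaries, so arrival times are effectively quantized to ticks.

```python
import math

TICKRATE = 66                   # illustrative tickrate, not a measured value
TICK_INTERVAL = 1.0 / TICKRATE

def tick_for(timestamp):
    """Map a command packet's timestamp onto the discrete tick at which
    the server will actually execute it (the next tick boundary)."""
    return math.ceil(timestamp / TICK_INTERVAL)

# Two commands arriving 5 ms apart can still land on the same tick:
print(tick_for(0.100), tick_for(0.105))   # -> 7 7
```

This is why sub-tick timer precision buys you nothing for hit
registration: everything gets snapped to the tick grid anyway.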

Unfortunately this would raise the recommended system specs
for a game and the price you pay for renting a game server, so imo that's
why they leave it untouched.

Would it now, or is this untested speculation again?
Think about where the system is bottlenecked, then think about the above again.

I'm not sure what he means by 60hz.

I think he was referring to the default refresh rate of his monitor.
Just for clarity's sake:

THE DEFAULT TIMER RESOLUTION OF THE WINDOWS KERNEL IS 7.8ms.

FYI, Hz (hertz) means "times per second" and is the SI unit of
frequency.
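
For the arithmetic-inclined: a timer period and a frequency are just
reciprocal ways of stating the same rate. A quick sketch of the
conversion:

```python
def period_ms_to_hz(period_ms):
    """Convert a period in milliseconds to a frequency in hertz."""
    return 1000.0 / period_ms

def hz_to_period_ms(hz):
    """Convert a frequency in hertz to a period in milliseconds."""
    return 1000.0 / hz

# The 7.8 ms kernel timer period quoted above is roughly 128 Hz...
print(round(period_ms_to_hz(7.8)))      # -> 128
# ...while a 60 Hz monitor refresh corresponds to about 16.7 ms.
print(round(hz_to_period_ms(60), 1))    # -> 16.7
```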

Just because I'm feeling generous, I'll drop another Sysinternals
gem in here:
http://www.sysinternals.com/Information/bootini.html
Do a search on that page for "timer".

Many happy improvements,

james.

P.S. As you can see, I am starting to get a little bitter about all the
netcode and other configuration crap that surrounds the hldm and srcds
communities. Please please please please please stop producing volumes
of totally unprofessional, unscientific, and generally only partially
true or useful information.

_______________________________________________
To unsubscribe, edit your list preferences, or view the list archives, please 
visit:
http://list.valvesoftware.com/mailman/listinfo/hlds
