pre-script (probably not an actual term, heh. Probably should be
'foreword.'): I've rewritten this about 5 times now, trying to come at
it from a different angle each time, heh. I'm still not sure it comes
across very well...

Well, I don't know if I'm grossly misunderstanding something or if you
are. The server takes client input and calculates the next tick, then
sends that tick's data to clients. The fact that there are 33 ticks
per second doesn't mean the server takes 1/33 of a second to calculate
a tick, and the time spent handling player input doesn't delay the
next tick by that much. The ticks happen at a fixed rate. Not an
approximate one, not a loosely governed one. They happen at regular
intervals. That only means 33 of them occur per second; it might take
only 1/333 of a second to calculate each one, with the rest of the
time spent doing absolutely nothing.

Again, the server fps tells you how quickly the code that handles
input is running. If it is running faster than the tickrate, the tick
calculations are not being starved for data, and everything is
happening fast enough for the gameworld to continue onward without
incident. This is why Alfred (or if not Alfred, someone from Valve
anyway) has said in the past that as long as your server fps is higher
than your tickrate, everything is hunky dory. How much higher is
irrelevant; as long as it is higher, the gamestate calculations aren't
being starved. Ever-higher server fps only means more idle time spent
doing absolutely nothing.

Gamestate is king. Nothing happens faster or more often than the
gamestate rate, since the only thing happening is that gamestate
progression.

On 8/9/05, James Tucker <[EMAIL PROTECTED]> wrote:
> No I didn't, I said it is capable of processing a piece of client data
> in 1/500 seconds, as opposed to 1/33 seconds (the fact that it rewinds
> by cl_interp+latency and processes that tick is irrelevant for this
> discussion). When you do the calculations of the timeline of
> processing of packets vs. gameworld progression (ticks passing) you
> will find that server side FPS has a major impact on PROCESSING
> LATENCY. This has NOTHING TO DO WITH CLIENT LATENCY. Interpolation is
> not important here, it is merely affected, and more so when a server is
> running at a low FPS. The REASON is simple: at 33 fps the server
> takes one whole tick to process a frame, so the next update will
> happen in a minimum of one tick's time (no sub-tick updates; client
> re-play will occur for corrected ticks) and so on. Forget about the
> gameworld rate; this is about how fast the server processes data from
> WHEN IT HAS ARRIVED to WHEN IT LEAVES (the server). A server that is
> only managing to process one frame per tick (or marginally worse, as
> is common when fps is so low; typically a machine which can manage
> 30-35 fps will only run at 30-31, though I have neither the time nor
> inclination to explain that one) will send responses 1 tick later than
> optimal, as the next update after a processed command will be 2 ticks
> after it arrived. Arrive mid-tick: tick completes, next frame makes a
> new tick AND processes client data (can it do both in one frame?),
> data goes into the queue, next tick: send.
>
> I hope this is more clear. :-)
>
> Client side interpolation is affected if the updaterate is too low,
> or more importantly the real packetrate (as updaterate is merely a
> desired 'boot-time' variable, whereas packetrate is an observed
> variable of the run-time system). At cl_interp 0.1 the above scenario
> cuts a VERY fine line and will cause an extrapolation scenario
> relatively frequently.
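
For reference, here's a back-of-envelope version of the timeline you're
describing, as I understand it (my own sketch in C++; the numbers are
illustrative, not measured from any real server):

#include <cstdio>

int main() {
    const double tick_ms = 1000.0 / 33.0;   // ~30.3 ms between outgoing updates

    // Quoted scenario, server at ~1 frame per tick: a command arriving
    // mid-tick waits for the current tick to finish, is processed on the
    // next frame, and its result ships on the update after that.
    double low_fps_worst = 2.0 * tick_ms;

    // Server at ~500 fps: the command is picked up within ~2 ms and its
    // result rides the very next outgoing tick.
    double frame_ms = 1000.0 / 500.0;
    double high_fps_worst = frame_ms + tick_ms;

    std::printf("arrival-to-send at ~33 server fps : up to %.1f ms\n", low_fps_worst);
    std::printf("arrival-to-send at ~500 server fps: up to %.1f ms\n", high_fps_worst);
    return 0;
}

Which prints roughly 60.6 ms vs. 32.3 ms for the worst case.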


--
Clayton Macleod
