Christian Ohm wrote:
> What about (pseudocode)
> {
>       current_ticks = SDL_GetTicks();
>       frame_ticks = current_ticks - last_ticks;
>       actual_ticks = rate_ticks - frame_ticks;
>       if (actual_ticks > 0)
>               SDL_Delay(actual_ticks);
>       last_ticks = current_ticks;
> }
> The framecount here is useless, as there's no use in waiting even longer
> if you haven't met the previous deadline. Then add an actual framerate
> counter in a way that SDL_GetTicks only gets called once per frame.
> Also the delay should be completely disabled when vsync is active, else
> it'll only add more useless waiting.
The pseudocode as you have written it above only takes into account the
time used by the current frame: if that is less than the time the frame
is allowed to consume, the difference between the two is slept away. If,
however, the previous frame consumed more time than it should have, the
above pseudocode does not compensate for that. The result is an exact
match for the current frame (at least as far as the CPU scheduler
permits it), but an overall increase in the average time consumed per
frame (i.e. a drop in framerate).
>> it will never get to the line SDL_Delay(the_delay); because SDL_GetTicks()
>> gets the time in ms since the SDL lib was first initialized.
>> a wz game frame would probably take quite a few ms, so current_ticks is
>> always greater than target_ticks (lastticks + 1000/framelimits * 1, always one
>> because it gets reset every time), so this bug ends up as a busy cpu
>> loop, which explains why wz uses so much cpu time.
> Well, the delay is only meant to happen when there is time left in the
> frame, so if the frame took longer than one frame at the target frame
> rate, the game runs too slow and you don't delay.
>> basically the SDL tick calculations are pointless, since framerateDelay is
>> called once per game cycle.
>> so it should be:
>> void SDL_framerateDelay(FPSmanager * manager)
>> {
>>     manager->framecount++;
>>     SDL_Delay((Uint32)manager->rateticks);
>>     manager->lastticks = SDL_GetTicks();
>> }
> Now you add a delay of one targeted frame duration each frame, so you
> can't ever reach that rate.
Yup, that ^ will achieve this:
 delay time = target time per frame
 time per frame = required computation time + delay time
which will always result in:
 time per frame > target time per frame
(e.g. with a 20 ms target and 5 ms of computation you get 25 ms per
frame, i.e. 40 fps instead of the targeted 50)

while you really would want this:
 time per frame >= target time per frame
which can be achieved by this formula:
 delay time = target time per frame - required computation time
 time per frame = required computation time + delay time

However, this delay calculation would be a better fit:
 delay time = target time per frame - average(required computation time)

And that last calculation is *exactly* what is performed by the current
implementation.

That's of course unless framecount gets reset very often. So that just
leaves the question of how you came to the conclusion that it would be
reset every time.



Warzone-dev mailing list
