On Wed, Oct 17, 2018 at 12:06 AM Keith Packard <kei...@keithp.com> wrote:
>
> Bas Nieuwenhuizen <b...@basnieuwenhuizen.nl> writes:
>
> > Well, the complication here is that in the MONOTONIC (not
> > MONOTONIC_RAW) case the CPU measurement can happen at the end of the
> > MONOTONIC_RAW interval (since the order of measurements is based on
> > argument order), so you can get a tick that started `period` (5 in
> > this case) monotonic ticks before the start of the interval, and a
> > CPU measurement at the end of the interval.
>
> Ah, that's an excellent point. Let's split out raw and monotonic and
> take a look. You want the GPU sampled at the start of the raw interval
> and monotonic sampled at the end, I think?
>
>                  w x y z 0 1 2 3 4 5 6 7 8 9 a b c d e f
> Raw              -_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-
>
>           0         1         2         3
> GPU       -----_____-----_____-----_____-----_____
>
>                                     x y z 0 1 2 3 4 5 6 7 8 9 a b c
> Monotonic                           -_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-
>
> Interval                     <----------------->
> Deviation           <-------------------------->
>
>         start = read(raw)       2
>         gpu   = read(GPU)       1
>         mono  = read(monotonic) 2
>         end   = read(raw)       b
>
> In this case, the error between the monotonic pulse and the GPU is
> interval + gpu_period (probably plus one to include the measurement
> error of the raw clock).
>
> Thanks for finding this case.
>
> Now, I guess the question is whether we want to try to find the
> smallest maxDeviation possible for each query. For instance, if the
> application asks only for raw and gpu, the max_deviation could be
> max2(interval+1, gpu_period), but if it asks for monotonic and gpu, it
> would be interval+1+gpu_period. I'm not seeing a simple definition
> here...
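
Spelling those two variants out, just to make the comparison concrete
(only a sketch, not driver code; this assumes everything has already
been converted to nanoseconds):

    #include <stdint.h>

    #define MAX2(a, b) ((a) > (b) ? (a) : (b))

    /* raw + gpu only: the two errors don't have to be stacked */
    static uint64_t
    max_deviation_raw_gpu(uint64_t interval, uint64_t gpu_period)
    {
        return MAX2(interval + 1, gpu_period);
    }

    /* monotonic + gpu: the GPU tick can start a full period before
     * the interval while the monotonic sample lands at its end, so
     * the two errors add up */
    static uint64_t
    max_deviation_mono_gpu(uint64_t interval, uint64_t gpu_period)
    {
        return interval + 1 + gpu_period;
    }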

You can make the monotonic case the same as the raw case by making
sure the CPU clocks are always sampled first, e.g. by splitting the
loop into two passes and reading the CPU clocks in the first pass and
the GPU in the second. That way the case above becomes impossible.
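
Roughly something like this (a sketch only, not the actual anv/radv
code; sample_cpu_clock(), sample_gpu_clock(), clock_for_domain() and
gpu_period() are stand-ins for whatever the driver already uses, and
MAX2 is as in the snippet above):

    VkResult
    sample_timestamps(uint32_t timestampCount,
                      const VkCalibratedTimestampInfoEXT *pTimestampInfos,
                      uint64_t *pTimestamps,
                      uint64_t *pMaxDeviation)
    {
        uint64_t begin = sample_cpu_clock(CLOCK_MONOTONIC_RAW);

        /* Pass 1: all CPU domains, so no CPU sample can be taken
         * after the GPU sample. */
        for (uint32_t i = 0; i < timestampCount; i++) {
            if (pTimestampInfos[i].timeDomain != VK_TIME_DOMAIN_DEVICE_EXT) {
                clockid_t id =
                    clock_for_domain(pTimestampInfos[i].timeDomain);
                pTimestamps[i] = sample_cpu_clock(id);
            }
        }

        /* Pass 2: the GPU domain. */
        for (uint32_t i = 0; i < timestampCount; i++) {
            if (pTimestampInfos[i].timeDomain == VK_TIME_DOMAIN_DEVICE_EXT)
                pTimestamps[i] = sample_gpu_clock();
        }

        uint64_t end = sample_cpu_clock(CLOCK_MONOTONIC_RAW);

        /* With the CPU samples guaranteed to come first, the
         * monotonic case degenerates to the raw case, so the
         * raw-only bound from your mail covers both. */
        *pMaxDeviation = MAX2((end - begin) + 1, gpu_period());

        return VK_SUCCESS;
    }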

That said "start of the interval of the tick" is kinda arbitrary and
you could pick random other points on that interval, so depending on
what requirements you put on it (i.e. can the chosen position be
different per call, consistent but implicit or explicitly picked which
might be leaked through the interface) the max deviation changes. So
depending on interpretation this thing can be very moot ...


>
> --
> -keith