Re: [Mesa-dev] [mesa 9/9] glx/dri2: Implement getBufferAge

2015-01-20 Thread Ian Romanick
On 01/19/2015 03:00 AM, Chris Wilson wrote:
 Within the DRI2GetBuffers return packet there is a 4-byte field that is
 currently unused by any driver, i.e. flags. With the co-operation of a
 suitably modified X server, we can pass the last SBC on which the buffer
 was defined (i.e. the last SwapBuffers for which it was used) and 0 if
 it is fresh (with a slight loss of precision). We can then compare the
 flags field of the DRI2Buffer against the current swap buffers count and
 so compute the age of the back buffer (thus satisfying
 GLX_EXT_buffer_age).
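
For illustration, the age computation described above boils down to
something like the sketch below; the function and parameter names are
hypothetical, not part of the patch, but the arithmetic mirrors what the
patch does with last_swap_sbc and the per-buffer flags field.

   #include <stdint.h>

   /* The server stores the SBC at which the buffer was last defined in
    * the otherwise-unused 32-bit flags field, or 0 if the buffer is
    * fresh; comparing it against the current swap buffers count gives
    * the age wanted by GLX_EXT_buffer_age. */
   static int
   compute_buffer_age(uint64_t current_sbc, uint32_t flags)
   {
      int age;

      if (flags == 0)
         return 0;            /* fresh buffer: contents are undefined */

      age = current_sbc - flags + 1;
      if (age < 0)            /* truncated SBC wrapped; treat as unknown */
         age = 0;

      return age;
   }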
 
 As we reuse a driver specific field within the DRI2GetBuffers packet, we
 first query whether the X/DDX are ready to supply the new value using a
 DRI2GetParam request.
 
 Another caveat is that we need to complete the SwapBuffers/GetBuffers
 roundtrip before reporting the back buffer age so that it tallies
 against the buffer used for rendering. As with all things X, there is a
 race between the query and the rendering where the buffer may be
 invalidated by the server. However, for the primary use case (that of a
 compositing manager), the DRI2Drawable is only accessible to a single
 client, mitigating the impact of the issue.
 
 Signed-off-by: Chris Wilson ch...@chris-wilson.co.uk
 ---
  configure.ac   |  2 +-
  src/glx/dri2_glx.c | 65 ++
  2 files changed, 66 insertions(+), 1 deletion(-)
 
 diff --git a/configure.ac b/configure.ac
 index 870435c..ca1da86 100644
 --- a/configure.ac
 +++ b/configure.ac
 @@ -65,7 +65,7 @@ LIBDRM_INTEL_REQUIRED=2.4.52
  LIBDRM_NVVIEUX_REQUIRED=2.4.33
  LIBDRM_NOUVEAU_REQUIRED="2.4.33 libdrm >= 2.4.41"
  LIBDRM_FREEDRENO_REQUIRED=2.4.57
 -DRI2PROTO_REQUIRED=2.6
 +DRI2PROTO_REQUIRED=2.9
  DRI3PROTO_REQUIRED=1.0
  PRESENTPROTO_REQUIRED=1.0
  LIBUDEV_REQUIRED=151
 diff --git a/src/glx/dri2_glx.c b/src/glx/dri2_glx.c
 index 0577804..b43f115 100644
 --- a/src/glx/dri2_glx.c
 +++ b/src/glx/dri2_glx.c
 @@ -917,6 +917,67 @@ dri2GetBuffersWithFormat(__DRIdrawable * driDrawable,
  }
  
  static int
 +dri2HasBufferAge(int screen, struct glx_display * priv)
 +{
 +   const struct dri2_display *const pdp =
 +      (struct dri2_display *)priv->dri2Display;
 +   CARD64 value;
 +
 +   if (pdp->driMajor <= 1 && pdp->driMinor < 4)
 +      return 0;
 +
 +   value = 0;
 +   if (!DRI2GetParam(priv->dpy, RootWindow(priv->dpy, screen),
 +                     DRI2ParamXHasBufferAge, &value))
 +      return 0;
 +
 +   return value;
 +}
 +
 +static int
 +dri2GetBufferAge(__GLXDRIdrawable *pdraw)
 +{
 +   struct dri2_drawable *priv = (struct dri2_drawable *) pdraw;
 +   int i, age = 0;
 +
 +   if (priv->swap_pending) {
 +      unsigned int attachments[5];

I see other callers that have attachments of at least 8 (although it
appears that intel_query_dri2_buffers only needs 2).  Could we at least
get an assertion or something that priv->bufferCount <=
ARRAY_SIZE(attachments)?  A (hypothetical) driver doing stereo rendering
with separate, DDX-managed depth and stencil buffers would need 6.  A
(again, hypothetical) driver with AUX buffers could need... more.
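
A minimal sketch of that check, assuming the usual ARRAY_SIZE macro is
available in dri2_glx.c, would be:

   #include <assert.h>

   unsigned int attachments[5];

   /* Guard against a driver requesting more attachments than the
    * fixed-size array can hold (stereo, separate depth/stencil, AUX). */
   assert(priv->bufferCount <= ARRAY_SIZE(attachments));

   for (i = 0; i < priv->bufferCount; i++)
      attachments[i] = priv->buffers[i].attachment;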

 +      DRI2Buffer *buffers;
 +
 +      for (i = 0; i < priv->bufferCount; i++)
 +         attachments[i] = priv->buffers[i].attachment;
 +
 +      buffers = DRI2GetBuffers(priv->base.psc->dpy, priv->base.xDrawable,
 +                               &priv->width, &priv->height,
 +                               attachments, i, &i);

Most drivers prefer DRI2GetBuffersWithFormat, and some drivers only use
DRI2GetBuffersWithFormat.  Is mixing DRI2GetBuffersWithFormat and
DRI2GetBuffers going to cause problems or unexpected behavior changes?

 +      if (buffers == NULL)
 +         return 0;
 +
 +      process_buffers(priv, buffers, i);
 +      free(buffers);
 +
 +      dri2XcbSwapBuffersComplete(priv);
 +   }
 +
 +   if (!priv->have_back)
 +      return 0;
 +
 +   for (i = 0; i < priv->bufferCount; i++) {
 +      if (priv->buffers[i].attachment != __DRI_BUFFER_BACK_LEFT)
 +         continue;
 +
 +      if (priv->buffers[i].flags == 0)
 +         continue;
 +
 +      age = priv->last_swap_sbc - priv->buffers[i].flags + 1;
 +      if (age < 0)
 +         age = 0;

I was going to comment that this looked like it calculated age wrong
when the buffers had different ages.  Then I realized that age should
only be calculated once.  I think this would be more obvious if the body
of the loop were:

   if (priv->buffers[i].attachment == __DRI_BUFFER_BACK_LEFT &&
       priv->buffers[i].flags != 0) {
      age = priv->last_swap_sbc - priv->buffers[i].flags + 1;
      if (age < 0)
         age = 0;

      break;
   }

I also just noticed that your patches are mixing tabs and spaces (use
spaces only) and are using a mix of 3-space and 8-space (maybe?) indents
(use 3 spaces only).

 +   }
 +
 +   return age;
 +}
 +
 +static int
  dri2SetSwapInterval(__GLXDRIdrawable *pdraw, int interval)
  {
 xcb_connection_t *c = 

Re: [Mesa-dev] [mesa 9/9] glx/dri2: Implement getBufferAge

2015-01-20 Thread Ian Romanick
On 01/20/2015 12:35 PM, Ian Romanick wrote:
 On 01/19/2015 03:00 AM, Chris Wilson wrote:
 +      DRI2Buffer *buffers;
 +
 +      for (i = 0; i < priv->bufferCount; i++)
 +         attachments[i] = priv->buffers[i].attachment;
 +
 +      buffers = DRI2GetBuffers(priv->base.psc->dpy, priv->base.xDrawable,
 +                               &priv->width, &priv->height,
 +                               attachments, i, &i);
 
 Most drivers prefer DRI2GetBuffersWithFormat, and some drivers only use
 DRI2GetBuffersWithFormat.  Is mixing DRI2GetBuffersWithFormat and
 DRI2GetBuffers going to cause problems or unexpected behavior changes?

Okay... I hadn't seen the server change until after I sent this review.
I sent some comments there.

This should be fine, but can we get a comment that it relies on
DRI2GetBuffers re-using the format from the previous
DRI2GetBuffersWithFormat?  That way the next person to look at the code
won't make the same mistake I made.
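
Something along these lines above the DRI2GetBuffers call would do; the
exact wording is only a suggestion:

   /* We deliberately use DRI2GetBuffers here even though the driver may
    * have allocated its buffers with DRI2GetBuffersWithFormat: the server
    * re-uses the format from the previous DRI2GetBuffersWithFormat request
    * for each attachment, so the buffers returned still match what the
    * driver asked for. */
   buffers = DRI2GetBuffers(priv->base.psc->dpy, priv->base.xDrawable,
                            &priv->width, &priv->height,
                            attachments, i, &i);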
