2010/1/19 Kristian Høgsberg <k...@bitplanet.net>:
[snipped]
>> When we work on code in this area we should also consider the case of
>> using a rendering context that's bound to no window at all.  That
>> could be useful for FBO rendering or just compiling shaders, etc.
>> Mesa would probably fail if this were attempted today, but that should
>> get fixed.
> I just implemented this for the Intel DRI driver. It is indeed very
> useful and quite straightforward.  My take on the MESA_screen_surface
> + EGL is that we shouldn't try to wrap the KMS API in a GL extension.
> It's a complicated API that is still evolving.  Instead, being able to
> bind a native GEM/TTM buffer to a GL render buffer along with an
> extension to make a context current without requiring a
> surface/drawable is a perfect companion to the KMS API.  I'm finishing
> up a branch with this work now and will post an
> announcement/description later.
That sounds cool.  I am looking forward to it.

If you are working on it for Wayland, I want to share a little about Android's
take on EGL.  In Android, the display server acts both as a traditional
display server and as the window compositor.  There is a single libEGL on
Android with no special extensions; the display server and the clients use
standard EGL in the same way.

As we know, EGL works with surfaces, which are created from native
windows/pixmaps.  A common problem is that a native window/pixmap usually has
a different structure in the server than in the clients: there is no single
EGLNativeWindowType that can be passed to eglCreateWindowSurface in both the
server and the clients.

Android's approach to the problem is to define EGLNativeWindowType and
EGLNativePixmapType as two interfaces.  The server wraps the fb device in the
EGLNativeWindowType interface and creates an EGLSurface for it.  Everything on
screen is composited through OpenGL ES.  A client can likewise wrap its window
drawable in the same interface to create an EGLSurface for 3D rendering.

For texture-from-pixmap, Android uses the EGLImage extensions.  An EGLImage
can be created from a native pixmap, a texture object, a renderbuffer, or a
VGImage.  Once created, the EGLImage can be used as a texture image, as
renderbuffer storage, or as a VGImage.  It allows pixel sharing between the
display server and the rendering APIs.  OpenMAX even allows its video frames
to be used as EGLImages for composited hardware video decoding.

When Mesa gains EGLImage support (hopefully two or three months away), I do
think GEM objects could be passed to EGL to create EGLImages.  For your use
case, for example, GL_OES_EGL_image could then be used to turn EGLImages into
renderbuffers.
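
For reference, the pixmap-to-renderbuffer path might look something like the
sketch below, assuming the EGL_KHR_image_base and GL_OES_EGL_image
extensions.  This is not runnable standalone (it needs a current EGL display
and GLES context), and in practice eglCreateImageKHR and
glEGLImageTargetRenderbufferStorageOES are usually fetched via
eglGetProcAddress rather than called directly:

```c
/* Sketch only: wrap a native pixmap in an EGLImage, then use it as
 * renderbuffer storage via GL_OES_EGL_image. */
#define EGL_EGLEXT_PROTOTYPES
#define GL_GLEXT_PROTOTYPES
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

GLuint renderbuffer_from_pixmap(EGLDisplay dpy, EGLNativePixmapType pixmap)
{
    /* Create an EGLImage from the native pixmap (EGL_KHR_image_base /
     * EGL_KHR_image_pixmap). */
    EGLImageKHR img = eglCreateImageKHR(dpy, EGL_NO_CONTEXT,
                                        EGL_NATIVE_PIXMAP_KHR,
                                        (EGLClientBuffer)pixmap, NULL);

    /* Bind it as renderbuffer storage (GL_OES_EGL_image). */
    GLuint rb;
    glGenRenderbuffers(1, &rb);
    glBindRenderbuffer(GL_RENDERBUFFER, rb);
    glEGLImageTargetRenderbufferStorageOES(GL_RENDERBUFFER, (GLeglImageOES)img);
    return rb;
}
```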

-olv

_______________________________________________
Mesa3d-dev mailing list
Mesa3d-dev@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/mesa3d-dev
