Hi Jose,
thanks for the info, I got it working now:
- Instead of reading depth I just render 256 greylevel quads and
readback the color buffer (performance is not an issue here,
I just need an image that tells me that the depth buffer is ok)
- The GLSL depth-write shader doesn't work when the texture format
is DEPTH_COMPONENT. I've changed it to LUMINANCE16 when running
on Mesa/VMware.
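In case it helps anyone else, the grey-level quads pass looks roughly like this (just a sketch; it assumes identity modelview/projection matrices, a current GL context with the scene's depth already rendered, and `width`/`height`/`buf` defined elsewhere):

```c
/* Sketch: visualize the depth buffer without reading it directly.
 * Quad i covers the screen at window depth i/255 with grey level i/255.
 * Drawn front-to-back with depth writes off and the GL_LESS test, the
 * last quad that passes at a pixel has the largest v still below the
 * stored depth, so the color buffer ends up holding an 8-bit image of
 * the depth buffer. */
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LESS);
glDepthMask(GL_FALSE);              /* keep the depth buffer intact */
for (int i = 0; i < 256; ++i) {
    float v = i / 255.0f;
    float z = 2.0f * v - 1.0f;      /* NDC z -> window depth v (default depth range) */
    glColor3f(v, v, v);
    glBegin(GL_QUADS);
    glVertex3f(-1.0f, -1.0f, z);
    glVertex3f( 1.0f, -1.0f, z);
    glVertex3f( 1.0f,  1.0f, z);
    glVertex3f(-1.0f,  1.0f, z);
    glEnd();
}
glDepthMask(GL_TRUE);
/* now read back the *color* buffer instead of the depth buffer */
glReadPixels(0, 0, width, height, GL_LUMINANCE, GL_UNSIGNED_BYTE, buf);
```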
Again, thanks for the help.
Stefan
Quoting José Fonseca <jfons...@vmware.com>:
The D3D9 API limits blits to/from depth-stencil buffers as well. The API
is pretty much designed to ensure that depth-stencil buffers stay in
VRAM (probably in a hardware specific layout) and never get out of
there.
Several vendors allow binding the depth buffer as a texture, but they
implicitly do shadow mapping. It might be possible to read the depth
values with certain non-standard depth formats supported by major
vendors. Reading the stencil buffer is pretty much impossible AFAICT.
Jose
On Tue, 2011-01-25 at 06:04 -0800, stef...@starnberg-mail.de wrote:
Hi Jose,
thanks for the quick reply. I'm using Win7 for both guest (32-bit)
and host (64-bit).
I do the depth buffer reads only for debugging / regression testing.
Would a copy depth-to-texture and a shader blit to the color channels
work? Reading the color back buffer via glReadPixels is ok.
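Something along these lines (just a sketch; `depthTex` would be a texture the depth buffer was copied into first, e.g. via glCopyTexSubImage2D):

```glsl
// Hypothetical blit shader: spread the copied depth value across the
// color channels so the result can be read back with glReadPixels.
uniform sampler2D depthTex;   // depth buffer previously copied into a texture

void main()
{
    float d = texture2D(depthTex, gl_TexCoord[0].st).r;
    gl_FragColor = vec4(d, d, d, 1.0);
}
```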
Regards,
Stefan
Quoting José Fonseca <jfons...@vmware.com>:
> On Tue, 2011-01-25 at 01:13 -0800, stef...@starnberg-mail.de wrote:
>> Hi,
>>
>> I'm trying to get one of our test suites running in VMware
>> (VMware, Inc. Gallium 0.3 on SVGA3D; build: RELEASE; OGL 2.1 Mesa
>> 7.7.1-DEVEL).
>> With the GDI backend everything works fine (tested in 7.7,7.8,7.10).
>>
>> I have a GLSL shader that writes depth, like
>>
>> uniform sampler2D tex;
>>
>> void main()
>> {
>> vec4 v = texture2D(tex, gl_TexCoord[0].st);
>>
>> gl_FragColor = vec4(0,0,0,0);
>> gl_FragDepth = v.x;
>> }
>>
>> which doesn't work when running in VMware's Mesa.
>>
>> Even a simple clear and readback of the depth buffer doesn't work, like:
>> glClearDepth(0.6f);
>> glClear(GL_COLOR_BUFFER_BIT|GL_DEPTH_BUFFER_BIT);
>> glReadPixels(0,0,m_Dim[0], m_Dim[1], GL_DEPTH_STENCIL_EXT,
>> GL_UNSIGNED_INT_24_8_EXT, tmpBufferInt);
>> or
>> glReadPixels(0,0,m_Dim[0], m_Dim[1], GL_DEPTH_COMPONENT, GL_FLOAT,
>> tmpBuffer);
>>
>> Both reads return all zeros.
>>
>> I don't know if VMware's Mesa is a different branch, or if this is the
>> right place to report such bugs (if it is a bug).
>>
>> Stefan
>
> Stefan,
>
> What guest OS and host OS are you using?
>
> We can only comment here on the open sourced Linux OpenGL guest drivers.
>
> The typical procedure is to file an SR through
> http://www.vmware.com/support/contacts/ , which ensures the issue will
> be included in our internal bug database and then triaged and addressed
> in an eventual release.
>
> That said, I can tell you in advance that reading the depth-stencil
> buffer currently doesn't work on Windows hosts, due to limitations the
> D3D9 API imposes on locking depth-stencil buffers. But it should work
> on Linux/MacOSX hosts.
>
> Jose
>
>
>
_______________________________________________
mesa-dev mailing list
mesa-dev@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/mesa-dev