Hi Kim,

Probably the easiest way would be to render the scene with a render-to-texture
Camera, attaching one texture to the colour buffer and another to the depth
buffer.  Then, in a second pass, use a HUD Camera to render the colour texture
directly on a quad covering the left half of the screen, and the depth texture
as greyscale on a quad covering the right half.
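A minimal sketch of that two-pass setup, assuming OpenSceneGraph headers are available. The helper names (createRTTCamera, createHUDCamera) and the texture sizes are mine, not taken from any OSG example:

```cpp
#include <osg/Camera>
#include <osg/Geode>
#include <osg/Geometry>
#include <osg/Texture2D>

// Pass 1: a pre-render camera that draws `scene` into the two textures.
osg::Camera* createRTTCamera(osg::Node* scene,
                             osg::Texture2D* colourTex,
                             osg::Texture2D* depthTex,
                             int width, int height)
{
    osg::Camera* camera = new osg::Camera;
    camera->setRenderOrder(osg::Camera::PRE_RENDER);
    camera->setRenderTargetImplementation(osg::Camera::FRAME_BUFFER_OBJECT);
    camera->setViewport(0, 0, width, height);
    camera->attach(osg::Camera::COLOR_BUFFER, colourTex);
    camera->attach(osg::Camera::DEPTH_BUFFER, depthTex);
    camera->addChild(scene);
    return camera;
}

// Pass 2: an orthographic HUD camera drawing two textured quads --
// colour on the left half, depth (sampled as greyscale) on the right.
osg::Camera* createHUDCamera(osg::Texture2D* colourTex,
                             osg::Texture2D* depthTex)
{
    osg::Camera* hud = new osg::Camera;
    hud->setReferenceFrame(osg::Transform::ABSOLUTE_RF);
    hud->setProjectionMatrixAsOrtho2D(0.0, 2.0, 0.0, 1.0);
    hud->setViewMatrix(osg::Matrix::identity());
    hud->setRenderOrder(osg::Camera::POST_RENDER);
    hud->setClearMask(GL_DEPTH_BUFFER_BIT);

    osg::Geode* geode = new osg::Geode;

    // Left quad, x in [0,1]: the colour texture.
    osg::Geometry* left = osg::createTexturedQuadGeometry(
        osg::Vec3(0.0f, 0.0f, 0.0f),   // corner
        osg::Vec3(1.0f, 0.0f, 0.0f),   // width vector
        osg::Vec3(0.0f, 1.0f, 0.0f));  // height vector
    left->getOrCreateStateSet()->setTextureAttributeAndModes(0, colourTex);

    // Right quad, x in [1,2]: the depth texture.
    osg::Geometry* right = osg::createTexturedQuadGeometry(
        osg::Vec3(1.0f, 0.0f, 0.0f),
        osg::Vec3(1.0f, 0.0f, 0.0f),
        osg::Vec3(0.0f, 1.0f, 0.0f));
    right->getOrCreateStateSet()->setTextureAttributeAndModes(0, depthTex);

    geode->addDrawable(left);
    geode->addDrawable(right);
    hud->addChild(geode);
    return hud;
}
```

The textures themselves would be created before calling these helpers, e.g. with `setTextureSize(width, height)` on both, `setInternalFormat(GL_RGBA)` on the colour texture and `setInternalFormat(GL_DEPTH_COMPONENT)` on the depth texture; both cameras are then added as children of the scene graph root.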

The osgdistortion example probably comes closest to doing this type of
thing w.r.t. the RTT Camera and HUD Camera setup.

Robert.

On 9/20/07, Kim C Bale <[EMAIL PROTECTED]> wrote:
>
> Hi all,
>
>
> I'm trying to use OSG with the Philips 3D WOW auto-stereo display, which
> uses an unconventional stereo format. Basically you have to split the image
> vertically into two: on the left goes the contents of the frame buffer, and
> on the right you send a greyscale image of the Z buffer.
>
>
> What I would like to know is: what is the fastest way to get access to the Z
> buffer within OSG so that it can be used in this way?
>
> Thanks in advance,
>
> Kim.
> _______________________________________________
> osg-users mailing list
> [email protected]
> http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org