It looks like the problem isn't with the code itself: the scene graph that
gets rendered beforehand somehow didn't write to the depth buffer.
Sorry for bothering you,
can
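As a hedged illustration of that diagnosis, depth writes can be forced back on for the earlier subgraph with an osg::Depth attribute. This is only a sketch; `earlierSubgraph` and the function name are assumptions, not names from the thread, and it requires a live OpenGL context to have any effect:

```cpp
#include <osg/Depth>
#include <osg/Node>
#include <osg/StateSet>

// Hypothetical helper: 'earlierSubgraph' stands for whatever subgraph is
// rendered before the custom Drawable. Enabling the depth write mask makes
// its fragments update the depth buffer, so a later glReadPixels on
// GL_DEPTH_COMPONENT sees real values instead of the clear value (1.0).
void enableDepthWrites(osg::Node* earlierSubgraph)
{
    osg::StateSet* ss = earlierSubgraph->getOrCreateStateSet();
    osg::Depth* depth = new osg::Depth;   // defaults: GL_LESS, range [0,1]
    depth->setWriteMask(true);            // allow depth-buffer writes
    ss->setAttributeAndModes(depth, osg::StateAttribute::ON);
}
```

A translucent or overlay subgraph often disables depth writes deliberately, which would produce exactly the symptom described below.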
--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=11053#11053
___
Hi,
I'm trying to implement a custom osg::Drawable which needs to check the depth
component of a certain pixel before rendering itself. The problem is,
glReadPixels always returns 1.0 (even for the entire viewport, which obviously
should not be possible). I didn't touch the default settings of d
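For reference, a minimal sketch of reading the depth component from inside a custom Drawable might look like the following. The class name and pixel coordinates are assumptions for illustration, and the fragment needs a current OpenGL context, so it is not runnable on its own:

```cpp
#include <osg/Drawable>
#include <osg/RenderInfo>
#include <osg/GL>

// Hypothetical Drawable that samples the depth buffer before drawing.
class DepthCheckingDrawable : public osg::Drawable
{
public:
    virtual void drawImplementation(osg::RenderInfo& /*renderInfo*/) const
    {
        // glReadPixels reads from the currently bound read framebuffer.
        // If everything rendered before this Drawable had depth writes
        // disabled, the buffer still holds its clear value, so every
        // pixel reads back as 1.0 (the far plane).
        GLfloat depth = 0.0f;
        glReadPixels(/*x=*/100, /*y=*/100, /*w=*/1, /*h=*/1,
                     GL_DEPTH_COMPONENT, GL_FLOAT, &depth);

        // depth is a window-space value in [0,1] after depth-range mapping.
        if (depth < 1.0f)
        {
            // something was rendered in front of the far plane here
        }
    }
};
```

Note that glReadPixels stalls the pipeline; if this check runs every frame, an asynchronous read via a pixel buffer object is the usual alternative.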