Eric Sokolowsky wrote:
> Mathias Buhr wrote:
>
>> Hi everyone,
>>
>> I'm trying to use the stencil buffer to render only parts of the final
>> image through other graph-attached cameras. The app works fine on ATI
>> hardware, but on NVIDIA it fails: OpenGL reports that the stencil
>> buffer has 0 bits (it has 8 bits on ATI, queried via
>> glGetIntegerv(GL_STENCIL_BUFFER_BITS, ...) in a DrawCallback). I have
>> tested this on various Linux boxes (Ubuntu 9.04, Fedora 11) with recent
>> hardware and recent drivers.
>> I'm pretty sure that a stencil buffer is available on this hardware,
>> because OSG seems to be able to utilize it (the stereo mode in
>> osgUtil::SceneView uses stenciling and works fine), so I've probably
>> missed something. Is there anything special to do to get or enable a
>> stencil buffer on NVIDIA?
>
> My OSG application uses a stencil buffer on NVIDIA hardware, so it is
> definitely possible. There should be a mechanism (depending on how you
> create your context) to request a stencil-buffer-enabled context. I'm
> using FLTK to create my OpenGL context, so my code example might not be
> useful to you.
>
> It looks like osg/DisplaySettings has a method setMinimumNumStencilBits
> that may be helpful to you. You could also look at the osgreflect demo.
>
> -Eric

Thank you Eric, but unfortunately I've already tried setMinimumNumStencilBits and the osgreflect demo, without success. Your point about how the context is created should be a useful hint, though.
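
For reference, here is a minimal sketch of what I'm doing, boiled down from the real app (the class name, the viewer setup, and the printout are illustrative rather than the actual code):

#include <iostream>
#include <osg/GL>
#include <osg/Camera>
#include <osg/DisplaySettings>
#include <osgViewer/Viewer>

// Prints how many stencil bits the realized context actually has.
class StencilBitsCallback : public osg::Camera::DrawCallback
{
public:
    virtual void operator()(osg::RenderInfo&) const
    {
        GLint bits = 0;
        glGetIntegerv(GL_STENCIL_BUFFER_BITS, &bits);
        std::cout << "stencil bits: " << bits << std::endl;
    }
};

int main()
{
    // Has to happen before the viewer realizes its context, since the
    // value feeds into the GLX visual / pixel format selection.
    osg::DisplaySettings::instance()->setMinimumNumStencilBits(8);

    osgViewer::Viewer viewer;
    // viewer.setSceneData(...);  // scene setup elided
    viewer.getCamera()->setFinalDrawCallback(new StencilBitsCallback);
    return viewer.run();
}

On ATI the callback prints 8; on the NVIDIA boxes it prints 0, even with the minimum-stencil-bits request in place.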
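
And regarding context creation: if I switch to creating the graphics context explicitly, my understanding is that the stencil request goes into the GraphicsContext::Traits instead. A sketch of that route (untested here; the function name and the window dimensions are made up):

#include <osg/GraphicsContext>
#include <osg/ref_ptr>

// Sketch: requesting a stencil buffer when creating the context by hand.
// With a toolkit-created context (FLTK, Qt, ...) the equivalent request
// has to go through that toolkit instead.
osg::ref_ptr<osg::GraphicsContext> createStencilContext()
{
    osg::ref_ptr<osg::GraphicsContext::Traits> traits =
        new osg::GraphicsContext::Traits;
    traits->x = 0;
    traits->y = 0;
    traits->width = 800;
    traits->height = 600;
    traits->windowDecoration = true;
    traits->doubleBuffer = true;
    traits->depth = 24;
    traits->stencil = 8;  // the explicit stencil request
    return osg::GraphicsContext::createGraphicsContext(traits.get());
}

The returned context would then be handed to the camera via setGraphicsContext(). I'll dig into how our contexts are actually set up.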
Best regards,
Mathias Buhr

