[osg-users] 0bit Stencil Buffer
Hi everyone, I'm trying to use the stencil buffer to render only parts of the final image through additional cameras attached to the scene graph. The app works fine on ATI hardware, but on NVIDIA it fails: OpenGL reports that the stencil buffer has 0 bits (versus 8 bits on ATI, queried via glGetIntegerv(GL_STENCIL_BITS) in a DrawCallback). I have tested this on various Linux boxes (Ubuntu 9.04, Fedora 11) with recent hardware and recent drivers. I'm fairly sure a stencil buffer is available on this hardware, because OSG itself seems able to use it (the stereo mode in osgUtil::SceneView uses stenciling and works fine), so I've probably missed something. Is there anything special I need to do to get or enable a stencil buffer on NVIDIA? Thanks for your help and your time! Nice greetings Mathias Buhr ___ osg-users mailing list osg-users@lists.openscenegraph.org http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
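For context, the query described above can be done in a camera final-draw callback. This is a minimal sketch of such a check, assuming an osgViewer-based setup; StencilBitsCallback is a name chosen here for illustration:

```cpp
#include <osg/Camera>
#include <osg/GL>
#include <osg/Notify>

// Hypothetical callback that prints the stencil depth of the current
// context. Attach it with camera->setFinalDrawCallback(new StencilBitsCallback).
struct StencilBitsCallback : public osg::Camera::DrawCallback
{
    virtual void operator()(osg::RenderInfo& /*renderInfo*/) const
    {
        GLint bits = 0;
        glGetIntegerv(GL_STENCIL_BITS, &bits);  // 0 means the context has no stencil planes
        osg::notify(osg::NOTICE) << "Stencil buffer bits: " << bits << std::endl;
    }
};
```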
Re: [osg-users] 0bit Stencil Buffer
Mathias Buhr wrote: [...] Is there anything special to do to get or enable a stencil buffer on Nvidia? My OSG application uses a stencil buffer on NVIDIA hardware, so it is definitely possible. There should be a mechanism (depending on how you create your context) to request a context with a stencil buffer. I'm using FLTK to create my OpenGL context, so my code probably wouldn't make a useful example for you. However, osg/DisplaySettings has a method setMinimumNumStencilBits that may be helpful, and the osgreflect demo is also worth a look. -Eric
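For reference, the DisplaySettings route mentioned above is essentially a one-liner. A minimal sketch, assuming the default osgViewer::Viewer creates the window and that the setting is made before the viewer realizes its contexts (the model file is just a placeholder):

```cpp
#include <osg/DisplaySettings>
#include <osgDB/ReadFile>
#include <osgViewer/Viewer>

int main()
{
    // Ask OSG to request at least 8 stencil bits for any context it
    // creates; this must happen before the viewer realizes its windows.
    osg::DisplaySettings::instance()->setMinimumNumStencilBits(8);

    osgViewer::Viewer viewer;
    viewer.setSceneData(osgDB::readNodeFile("cow.osg"));  // any model will do
    return viewer.run();
}
```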
Re: [osg-users] 0bit Stencil Buffer
Mathias Buhr wrote: [...] Is there anything special to do to get or enable a stencil buffer on Nvidia?

The stencil property of GraphicsContext::Traits is 0 by default. Just set it to 8 before creating the GraphicsContext and it should work. For example:

osg::ref_ptr<osg::GraphicsContext::Traits> traits = new osg::GraphicsContext::Traits;
traits->x = 100;
traits->y = 100;
traits->width = 640;
traits->height = 480;
traits->windowDecoration = true;
traits->doubleBuffer = true;
traits->red = 8;
traits->green = 8;
traits->blue = 8;
traits->depth = 24;
traits->stencil = 8;
traits->sharedContext = 0;

osg::ref_ptr<osg::GraphicsContext> gc = osg::GraphicsContext::createGraphicsContext(traits.get());

--J
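To actually render through a context created this way, it would then be attached to a viewer's master camera. A sketch of the usual pattern, with attachContext being a hypothetical helper name:

```cpp
#include <osg/GraphicsContext>
#include <osg/Viewport>
#include <osgViewer/Viewer>

// Hypothetical helper: attach an already-created context (such as the
// 'gc' from the snippet above) to a viewer's master camera.
void attachContext(osgViewer::Viewer& viewer, osg::GraphicsContext* gc,
                   int width, int height)
{
    viewer.getCamera()->setGraphicsContext(gc);
    viewer.getCamera()->setViewport(new osg::Viewport(0, 0, width, height));
}
```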
Re: [osg-users] 0bit Stencil Buffer
Eric Sokolowsky wrote: [...] It looks like osg/DisplaySettings has a method setMinimumNumStencilBits that may be helpful to you. You could also look at the osgreflect demo.

Thank you, Eric, but unfortunately I've already tried both setMinimumNumStencilBits and the osgreflect demo without success. Taking a closer look at how the context is created sounds like a useful hint, though. Nice greetings Mathias Buhr
Re: [osg-users] 0bit Stencil Buffer
This looks interesting, Jason. Thanks! I'll try that tomorrow. Nice greetings Mathias Buhr

Jason Daly wrote: [...] The stencil property of GraphicsContext::Traits is 0 by default. Just set it to 8 before creating the GraphicsContext and it should work.