Re: [osg-users] Using pbuffer as a texture.

2015-01-19 Thread Nicolas Baillard
Hi Trajce. Thank you very much for your answer. Unfortunately this isn't what I'm looking for. The prerender example does pretty much the same thing as the distortion example: it creates a window with a view and a scene graph and does render-to-texture within this graph. I, on the other hand,

Re: [osg-users] Using pbuffer as a texture.

2015-01-19 Thread Robert Osfield
Hi Nicolas, Personally I'd just create an osg::Camera, assign FBO attachments and a FinalDrawCallback to do a glReadPixels/osg::Image::readPixels() and then store the osg::Image. You could attach this Camera to the main GraphicsContext/Window of your application and just use its NodeMask to
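Robert's suggestion can be sketched roughly as below. This is only an illustration of the approach he describes (an FBO camera plus a FinalDrawCallback that calls osg::Image::readPixels()); the names `CaptureCallback` and `createRttCamera`, and the 1024x1024 size, are assumptions, not from the original post. It requires OpenSceneGraph to build.

```cpp
#include <osg/Camera>
#include <osg/Image>
#include <osg/Texture2D>

// Callback that copies the camera's colour buffer into an osg::Image
// once the camera has finished drawing its subgraph.
struct CaptureCallback : public osg::Camera::DrawCallback
{
    CaptureCallback(osg::Image* image) : _image(image) {}

    virtual void operator()(osg::RenderInfo& /*renderInfo*/) const
    {
        // Read back the framebuffer this camera just rendered into.
        _image->readPixels(0, 0, 1024, 1024, GL_RGBA, GL_UNSIGNED_BYTE);
    }

    osg::ref_ptr<osg::Image> _image;
};

osg::ref_ptr<osg::Camera> createRttCamera(osg::Texture2D* texture, osg::Image* image)
{
    osg::ref_ptr<osg::Camera> camera = new osg::Camera;
    camera->setRenderTargetImplementation(osg::Camera::FRAME_BUFFER_OBJECT);
    camera->setRenderOrder(osg::Camera::PRE_RENDER);
    camera->setViewport(0, 0, 1024, 1024);
    camera->attach(osg::Camera::COLOR_BUFFER, texture);       // render into the texture
    camera->setFinalDrawCallback(new CaptureCallback(image)); // then read back to CPU
    return camera;
}
```

As Robert notes, you can leave this camera attached to the main GraphicsContext and toggle it per frame via its node mask, e.g. `camera->setNodeMask(0x0)` to skip it and a non-zero mask to render it.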

Re: [osg-users] Using pbuffer as a texture.

2015-01-17 Thread Trajce Nikolov NICK
Hi Nicolas, yes it is possible. Have a look at the osgprerender example. You have to attach the texture to the camera. Look for camera->attach(osg::Camera::COLOR_BUFFER, texture, ...). Nick
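The attach call Nick points at sits in a setup along these lines; this is a minimal sketch in the spirit of the osgprerender example (texture size and format here are illustrative choices, not from the post), and it requires OpenSceneGraph to build.

```cpp
#include <osg/Camera>
#include <osg/Texture2D>

// Texture that will receive the camera's colour output.
osg::ref_ptr<osg::Texture2D> texture = new osg::Texture2D;
texture->setTextureSize(1024, 1024);
texture->setInternalFormat(GL_RGBA);

// Pre-render camera drawing into an FBO rather than the window.
osg::ref_ptr<osg::Camera> camera = new osg::Camera;
camera->setRenderTargetImplementation(osg::Camera::FRAME_BUFFER_OBJECT);
camera->setRenderOrder(osg::Camera::PRE_RENDER);
camera->setViewport(0, 0, 1024, 1024);

// The call referred to above: direct the colour buffer into the texture,
// which can then be used like any other texture in the scene graph.
camera->attach(osg::Camera::COLOR_BUFFER, texture.get());
```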

[osg-users] Using pbuffer as a texture.

2015-01-16 Thread Nicolas Baillard
Hello everyone. Looking at the screen capture example I see that I can use pixel buffers to do offscreen rendering. But then how can I use the content of the pixel buffer as a texture? I've seen the distortion example as well. It does render to texture too, but that's not quite what I want.