When rendering to a texture, what determines the RenderBuffer (internal) format? I would have thought the texture's internal format would be used to define the RenderBuffer format, but I don't see that anywhere in the code. I do see it done for image attachments, in FrameBufferAttachment::FrameBufferAttachment(Camera::Attachment& attachment), when the attachment holds an image pointer:
    osg::Image* image = attachment._image.get();
    if (image)
    {
        if (image->s()>0 && image->t()>0)
        {
            GLenum format = attachment._image->getInternalTextureFormat();
            if (format == 0)
                format = attachment._internalFormat;
            _ximpl = new Pimpl(Pimpl::RENDERBUFFER);
            _ximpl->renderbufferTarget = new osg::RenderBuffer(image->s(), image->t(), format);
        }
        else
        {
            osg::notify(osg::WARN)<<"Error: FrameBufferAttachment::FrameBufferAttachment(Camera::Attachment&) passed an empty osg::Image, image must be allocated first."<<std::endl;
        }
        return;
    }
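What I expected to find was something analogous for the texture path. Purely as a hypothetical sketch on my part (this is not actual OSG code, just the logic I was looking for):

    // Hypothetical sketch only -- not actual OSG code. This is the logic I
    // expected for texture attachments: take the RenderBuffer format from
    // the attached texture's internal format, falling back to the
    // attachment's explicit _internalFormat.
    osg::Texture* texture = attachment._texture.get();
    if (texture)
    {
        GLenum format = texture->getInternalFormat();
        if (format == 0)
            format = attachment._internalFormat;
        // ... create the multisample RenderBuffer with 'format' ...
    }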
Do I have to set the internal format manually by calling Camera::attach twice, like this?
   camera->attach(osg::Camera::COLOR_BUFFER, texture, 0, 0, false, samples, colorSamples);
   camera->attach(osg::Camera::COLOR_BUFFER, GL_LUMINANCE32F_ARB);
I don't see where in the code Camera::Attachment::_internalFormat gets initialized when the attachment is given a texture.
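For context, here is a minimal sketch of the full setup I'm describing. The helper name, texture size, source format/type, sample counts and GL_LUMINANCE32F_ARB are just the values I happen to be using; whether the second attach actually ends up as the multisample RenderBuffer's format is exactly the part I'm unsure about:

    #include <osg/Camera>
    #include <osg/Texture2D>

    // Minimal sketch of the double-attach workaround described above.
    osg::ref_ptr<osg::Camera> createFloatRttCamera(osg::ref_ptr<osg::Texture2D>& texture)
    {
        texture = new osg::Texture2D;
        texture->setTextureSize(1024, 1024);
        texture->setInternalFormat(GL_LUMINANCE32F_ARB);
        texture->setSourceFormat(GL_LUMINANCE);
        texture->setSourceType(GL_FLOAT);

        osg::ref_ptr<osg::Camera> camera = new osg::Camera;
        camera->setRenderTargetImplementation(osg::Camera::FRAME_BUFFER_OBJECT);
        camera->setViewport(0, 0, 1024, 1024);

        unsigned int samples = 4, colorSamples = 4;

        // First attach: bind the texture and request multisampling.
        camera->attach(osg::Camera::COLOR_BUFFER, texture.get(),
                       0, 0, false, samples, colorSamples);

        // Second attach: meant to fill in Attachment::_internalFormat so
        // the multisample RenderBuffer gets the float format -- the part
        // I'm asking about.
        camera->attach(osg::Camera::COLOR_BUFFER, GL_LUMINANCE32F_ARB);

        return camera;
    }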
Am I missing something? Is there a better example of this than the 
osgprerender.cpp example?
Paul P.


      