Robert,

I'm unable to get pixel values > 1.0 when rendering to an image, even
though I set the internal texture format of the image to GL_RGB16F*.
This is a problem when trying to capture HDR scenes.  I know this is
possible, since I'm already doing it in a small stand-alone OpenGL
project.
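For reference, the setup is roughly this (a trimmed sketch, not my
exact code; width, height and camera are defined elsewhere, and
GL_RGB16F_ARB stands in for whichever GL_RGB16F* token is in use):

    // allocate a float image and request a half-float internal format
    osg::ref_ptr<osg::Image> image = new osg::Image;
    image->allocateImage(width, height, 1, GL_RGBA, GL_FLOAT);
    image->setInternalTextureFormat(GL_RGB16F_ARB);

    // render to the image via an FBO
    camera->setRenderTargetImplementation(osg::Camera::FRAME_BUFFER_OBJECT);
    camera->attach(osg::Camera::COLOR_BUFFER, image.get());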

I'm not certain this is the cause, but it appears that the destination
image in the render-to-image path (using an FBO) has its
_internalTextureFormat overridden with a default of GL_RGBA.

The breakdown is this:

In RenderStage::drawInner:

...

        GLenum pixelFormat = _image->getPixelFormat();
        if (pixelFormat==0) pixelFormat = _imageReadPixelFormat;
        if (pixelFormat==0) pixelFormat = GL_RGB;

        GLenum dataType = _image->getDataType();
        if (dataType==0) dataType = _imageReadPixelDataType;
        if (dataType==0) dataType = GL_UNSIGNED_BYTE;

        _image->readPixels(_viewport->x(), _viewport->y(),
                           _viewport->width(), _viewport->height(), 
                           pixelFormat, dataType);

pixelFormat at this point is GL_RGBA.
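Incidentally, the dataType fallback matters for HDR too: if the
attached image never carries GL_FLOAT, the chain above lands on
GL_UNSIGNED_BYTE and the readback clamps every channel to [0,1] even
when the FBO itself is float.  Pre-setting the image's format and type
steers both fallbacks (a sketch using the stock osg::Image setters; it
doesn't address the internal format reset below):

    image->setPixelFormat(GL_RGBA);
    image->setDataType(GL_FLOAT);  // readPixels() then reads unclamped floats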

In _image->readPixels(), allocateImage() is called, which resets
_internalTextureFormat to pixelFormat:

...

    if (_data)
    {
        _s = s;
        _t = t;
        _r = r;
        _pixelFormat = format;
        _dataType = type;
        _packing = packing;
        _internalTextureFormat = format;  // <-- clobbers any format set explicitly
    }
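
If preserving an explicitly set format is the desired behaviour here,
the fix might be as small as guarding that one assignment (an untested
sketch):

    // only adopt the readback format when no internal texture
    // format has been set explicitly
    if (_internalTextureFormat==0) _internalTextureFormat = format;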

That said, I'm not sure at what point _internalTextureFormat is
actually consumed (presumably when the FBO's colour attachment is
created from the image), so this may not be the whole story, but the
reset appears to be the problem I'm having.
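
In the meantime I may try an application-side workaround: re-assert the
format after each frame's readback from a post-draw callback, so the
next camera setup sees it again (untested, RestoreInternalFormat is
just my own name for it, and it assumes the post-draw callback fires
after readPixels()):

    struct RestoreInternalFormat : public osg::Camera::DrawCallback
    {
        RestoreInternalFormat(osg::Image* image, GLint format)
            : _image(image), _format(format) {}

        virtual void operator()(osg::RenderInfo&) const
        {
            // put back the format that allocateImage() clobbered
            _image->setInternalTextureFormat(_format);
        }

        osg::ref_ptr<osg::Image> _image;
        GLint _format;
    };

    camera->setPostDrawCallback(
        new RestoreInternalFormat(image.get(), GL_RGB16F_ARB));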

Robert, can you shed some light on this?  I'll gladly fix the issue and
submit the change if you can point me in the right direction.

Thanks,
Brad


---
Renaissance Sciences Corporation
O/M: 480 290-3997
F:   425 675-8044