Hi Eric,

On 7/5/06, Eric Sokolowsky <[EMAIL PROTECTED]> wrote:
> I have determined that when reading a texture back, Image::_pixelFormat
> and Image::_internalTextureFormat should be set to the same value, which
> is obtained with a call to glGetTexLevelParameteriv(textureMode, 0,
> GL_TEXTURE_INTERNAL_FORMAT, ...). That is fine. But I still haven't
> found a way to get the pixel format from OpenGL. Perhaps we need to add
> something to the osg::State to track this.
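
As a point of reference, a minimal sketch of the readback step described
above might look like the following (the function name readBackFormats and
the GL_TEXTURE_2D target are my own illustration, not OSG's actual
implementation):

    #include <osg/Image>

    // Mirror the texture's internal format into both osg::Image fields.
    // Assumes the texture being read back is already bound to GL_TEXTURE_2D.
    void readBackFormats(osg::Image* image)
    {
        GLint internalFormat = 0;
        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                                 GL_TEXTURE_INTERNAL_FORMAT, &internalFormat);

        image->setInternalTextureFormat(internalFormat);

        // OpenGL has no query for the original client-side pixel format,
        // hence the suggestion to track it via osg::State; mirroring the
        // internal format is the stop-gap described above.
        image->setPixelFormat(static_cast<GLenum>(internalFormat));
    }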

I'm afraid I'm too cold on this topic to provide much insight.


> BTW, the set of images I was having trouble with was not a multiple of
> four in size. OpenGL will report an error when trying to use a
> compressed texture with glTexSubImage2D() if the subloaded texture is
> not a multiple of four in both dimensions. Is this something that
> should be checked for by OSG? I have a workaround in my application
> that disables texture compression on images that are not a multiple of
> four in size.
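
For what it's worth, that workaround might look something like this at the
application level (the helper name and the choice of DXT5 are illustrative
assumptions, not from the original post):

    #include <osg/Image>
    #include <osg/Texture2D>

    // Fall back to uncompressed texturing when the image dimensions are
    // not multiples of 4, so glTexSubImage2D() subloads stay legal.
    void applyCompression(osg::Texture2D* texture, const osg::Image* image)
    {
        const bool multipleOfFour =
            (image->s() % 4 == 0) && (image->t() % 4 == 0);

        texture->setInternalFormatMode(multipleOfFour
            ? osg::Texture::USE_S3TC_DXT5_COMPRESSION  // safe to compress
            : osg::Texture::USE_IMAGE_DATA_FORMAT);    // leave uncompressed
    }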

I believe the multiple-of-4 limit comes from S3TC compression, which
encodes the image in fixed 4x4 blocks of pixels, so subloaded regions
have to line up with those block boundaries.
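
To illustrate the arithmetic (the function below is my own sketch): each
4x4 block compresses to a fixed size, 8 bytes for DXT1 and 16 bytes for
DXT3/DXT5, so dimensions that are not multiples of 4 leave partial blocks
that a subload region cannot line up with.

    // Size in bytes of a DXT1-compressed image: whole 4x4 blocks only.
    unsigned int compressedSizeDXT1(unsigned int width, unsigned int height)
    {
        const unsigned int blocksWide = (width  + 3) / 4;  // round up
        const unsigned int blocksHigh = (height + 3) / 4;
        return blocksWide * blocksHigh * 8;  // 8 bytes per DXT1 block
    }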