Dear all,
I'm writing some GLSL code that needs to access a 1D floating-point
texture as input in a vertex shader. My problem is that inside GLSL I'm
getting clamped or normalized values (I'm not sure which) instead of the
full-range values.
For debugging purposes I've set up a dummy texture like this:
osg::Image *image = new osg::Image;
float *tmp = new float[1]; // USE_NEW_DELETE frees with delete[], so allocate with new[]
tmp[0] = 2.0f;
image->setImage(1, 1, 1, GL_R32F, GL_RED, GL_FLOAT,
(unsigned char*)tmp, osg::Image::USE_NEW_DELETE);
osg::Texture1D *texture = new osg::Texture1D();
texture->setFilter(osg::Texture::MIN_FILTER, osg::Texture::NEAREST);
texture->setFilter(osg::Texture::MAG_FILTER, osg::Texture::NEAREST);
texture->setImage(image);
In this case, the following GLSL expression:
texture1D(the_texture, 0.0).r
returns 1.0 instead of 2.0, which looks like the value is being clamped
to the [0,1] range of a normalized format.
But if I change the image setup to:
float *tmp = new float[4];
image->setImage(1, 1, 1, GL_RGBA32F, GL_RGBA, GL_FLOAT,
(unsigned char*)tmp, osg::Image::USE_NEW_DELETE);
it works fine.
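So, until the single-channel float formats are handled, padding the data
to four channels and uploading it as GL_RGBA32F seems like a usable
workaround. A minimal sketch of the padding step (plain C++, no OSG
types; expandRedToRGBA is a hypothetical helper name, not an OSG
function):

```cpp
#include <vector>
#include <cstddef>

// Widen a single-channel (red) float buffer to interleaved RGBA so the
// image can be uploaded as GL_RGBA32F, which computeInternalFormatType()
// already classifies as a float (non-normalized) format.
std::vector<float> expandRedToRGBA(const std::vector<float>& red)
{
    std::vector<float> rgba;
    rgba.reserve(red.size() * 4);
    for (std::size_t i = 0; i < red.size(); ++i)
    {
        rgba.push_back(red[i]); // R: original value
        rgba.push_back(0.0f);   // G: unused
        rgba.push_back(0.0f);   // B: unused
        rgba.push_back(1.0f);   // A: opaque
    }
    return rgba;
}
```

The shader side stays unchanged, since it only ever reads the .r
component; the cost is four times the texture memory.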
Looking at the source code in Texture.cpp, I found that
Texture::computeInternalFormatType() does not handle GL_R32F,
GL_RG32F, GL_R32UI, etc.: they all fall into the default clause of the
switch statement, which assigns NORMALIZED to _internalFormatType. At
the same time, I found no member function to change that attribute manually.
Is that an omission or am I doing something wrong in the initialization?
If that's an omission, is there an easy workaround that doesn't require
recompiling the library?
Thanks and best regards,
Juan
_______________________________________________
osg-users mailing list
[email protected]
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org