Hi everyone

I'm using GLSL shaders with OpenSceneGraph for some scientific
visualisation. I've got sample values in an n*m floating-point texture,
and a colour lookup table in a 1-D texture.
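
For context, the textures are built roughly like this (a simplified
sketch; data_image, lut_image and the internal formats stand in for my
real loading code):

// Data texture: one float sample per texel (sketch)
osg::ref_ptr<osg::Texture2D> seismic_tex = new osg::Texture2D;
seismic_tex->setImage(data_image.get());   // n*m GL_LUMINANCE / GL_FLOAT image
seismic_tex->setInternalFormat(GL_LUMINANCE32F_ARB);

// One of the colour lookup tables: 256 entries in a 1-D texture (sketch)
osg::ref_ptr<osg::Texture1D> red_texture = new osg::Texture1D;
red_texture->setImage(lut_image.get());    // 256x1 image holding the colour scale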

I'm sticking those textures on an osg::TexturedQuadGeometry like this:
state->setTextureAttributeAndModes(texnum_red, red_texture, osg::StateAttribute::OFF);
state->setTextureAttributeAndModes(texnum_green, green_texture, osg::StateAttribute::OFF);
state->setTextureAttributeAndModes(texnum_blue, blue_texture, osg::StateAttribute::OFF);
state->setTextureAttributeAndModes(texnum_seismic, seismic_tex, osg::StateAttribute::OFF);

Then passing them into the GLSL program:
state->addUniform(new osg::Uniform("cdef_red", texnum_red));
state->addUniform(new osg::Uniform("cdef_green", texnum_green));
state->addUniform(new osg::Uniform("cdef_blue", texnum_blue));
state->addUniform(new osg::Uniform("seismic", texnum_seismic));

When I zoom away from the resulting plane, I'm seeing visual artefacts.
After some debugging it looks like the colour texture (and probably the
data texture as well) is being minified: mipmapping averages
neighbouring texels together, so my shader, which expects a linear
colour scale with 256 distinct values, ends up indexing into a colour
scale with fewer values than it was written for.

Does anyone know how I can turn off minification and break the
association between these textures and the current camera position? I
want to treat them as plain old data that gets passed to my shader,
which then indexes into them to determine the output colour.
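
I'm guessing the fix involves forcing nearest-neighbour filtering on
each texture, something like the sketch below (untested), but I'm not
sure whether that's enough or whether OSG will still resample the
textures behind my back:

// Sample the textures exactly, with no mipmapping or interpolation
// (sketch, untested -- would be applied to all four textures)
seismic_tex->setFilter(osg::Texture::MIN_FILTER, osg::Texture::NEAREST);
seismic_tex->setFilter(osg::Texture::MAG_FILTER, osg::Texture::NEAREST);
// Stop OSG rescaling non-power-of-two textures, which would also
// resample the data
seismic_tex->setResizeNonPowerOfTwoHint(false);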

Thanks for any help!

Grahame

