Hi Fred
What texture pixel format are you using? I'm using an integer texture
format. Not sure if OSG's allocateImage method behaves ok in this
case.
I've only used setup code of this style:
image->setImage(size, 1, 1, GL_LUMINANCE32F_ARB, GL_LUMINANCE,
GL_FLOAT, data, osg::Image::NO_DELETE);
I don't know whether it may fail for other formats.
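For what it's worth, the shader side that goes with that kind of float setup
would be something like this (just a rough sketch, untested; the sampler name
is made up):

Code:
#version 150 compatibility
#extension GL_EXT_gpu_shader4 : enable

uniform samplerBuffer lumTex; // bound to the GL_LUMINANCE32F_ARB buffer above

void main(void)
{
    // With GL_LUMINANCE the single float per texel is replicated into .rgb
    float value = texelFetchBuffer(lumTex, 0).r;
    gl_FragColor = vec4(value, value, value, 1.0);
}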
I am creating the texture the following way:
Code:
bbp::RTNeuron::TextureBuffer *tb = new bbp::RTNeuron::TextureBuffer();
// 4 bytes per pixel, R-G-B-A format as per the EXT_texture_integer formats specification
tb->setInternalFormat(GL_RGBA8UI);
osg::Image *image = new osg::Image();
// note: width=(128*128), height=1
image->allocateImage(128*128, 1, 1, GL_RGBA_INTEGER, GL_UNSIGNED_BYTE, 1);
tb->setImage(0, image);
If you use GL_RGBA_INTEGER then you must use isamplerBuffer or
usamplerBuffer, otherwise the results are undefined.
For a normalized value I don't know what you were expecting, but it can't
be the same as with floating point or true integer formats.
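To illustrate, a fragment shader matching a GL_RGBA8UI / GL_RGBA_INTEGER
buffer would look roughly like this (a quick sketch, untested):

Code:
#version 150 compatibility
#extension GL_EXT_gpu_shader4 : enable

// An unsigned integer sampler to match the GL_RGBA8UI internal format
uniform usamplerBuffer tex;

void main(void)
{
    // texelFetchBuffer returns the raw unsigned integers (0..255 per channel)
    uvec4 texel = texelFetchBuffer(tex, 0);
    // Integer formats are never normalized for you, so convert by hand if you want a colour
    gl_FragColor = vec4(texel) / 255.0;
}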
The main differences between your code and mine are that I'm setting the
internal texture format explicitly and I'm using a plain array of data
instead of RGB/RGBA formats.
Code:
#version 150 compatibility
#extension GL_EXT_gpu_shader4 : enable

uniform samplerBuffer tex;

void main(void)
{
    if (textureSizeBuffer(tex) == (128*128)) // size in pixels
        gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // red (as I am expecting)
    else
        gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0); // green
}
The result is always green here, not red as I am expecting.
I haven't used textureSizeBuffer before, but from the docs I've read your
code should pretty much work. Does it also go wrong with a normalized
texture format or a floating point format?
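One thing you could try to narrow it down: write whatever textureSizeBuffer
reports into the output colour instead of comparing against a constant
(again just a rough sketch, untested):

Code:
#version 150 compatibility
#extension GL_EXT_gpu_shader4 : enable

uniform samplerBuffer tex;

void main(void)
{
    // red = reported size / expected size; black means it reports 0
    float size = float(textureSizeBuffer(tex));
    gl_FragColor = vec4(size / (128.0 * 128.0), 0.0, 0.0, 1.0);
}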
Cheers,
Juan