Juan Hernando wrote:
I'll be very busy until next Friday so I can't answer you properly. A
quick answer is that I just replicated what I saw inside the OSG code
for other textures. If you're sure that the binding is not needed,
remove it. I'll try to come back to this issue later and
Hi Fred,
The call to glTextureBufferEXT, during the rendering, is not needed. It is only
needed when you prepare the Texture Buffer, but not during the display.
Does it make sense to you if I comment out the following line of code
(see // COMMENTED OUT below):
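Not OSG code, but a minimal C++ sketch (with hypothetical names) of the split Fred describes: the buffer-to-texture attachment (glTexBufferEXT) happens once, when the texture buffer is prepared, while per-frame display only re-binds the texture.

```cpp
// Hypothetical model of the setup/draw split for a texture buffer object.
// attachCalls stands in for glTexBufferEXT, bindCalls for glBindTexture.
struct TextureBufferModel {
    int attachCalls = 0;  // times the buffer was attached to the texture
    int bindCalls = 0;    // times the texture was bound for drawing

    // Called once, when the texture buffer is prepared.
    void prepare() {
        if (attachCalls == 0)
            ++attachCalls;  // glTexBufferEXT(...): associate buffer with texture
    }

    // Called every frame; only binding is required here.
    void display() {
        ++bindCalls;        // glBindTexture(GL_TEXTURE_BUFFER_EXT, id)
    }
};
```

However many frames are drawn, the attachment count stays at one; that is the sense in which the glTextureBufferEXT call during rendering is redundant.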
Hi Juan,
I finally managed to make it work. I can't tell exactly what was wrong, as I
got tangled in several issues at the same time (signed/unsigned sampler1D vs
buffer, the ATI (bogus) compiler vs nvidia, a texture originally too large,
incorrect pixel formats...). I just ended up checking
Hi Fred,
Good to know that you could make it.
For sure, I don't think my implementation is the best possible. I just
wanted to get something working quickly. My idea is that someone else
can take it as a starting point for a future OSG implementation.
Cheers,
Juan
Hi,
My simple fragment shader with textureSize() seems to work fine when dealing
with a usampler1D uniform (the problem I had at one point was that I was giving
a texture that was too big for the driver, hence the size ended up being
actually 1, and my test was always failing).
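The "too large, so the size ended up as 1" failure suggests a pre-flight check against the driver limit is worthwhile. A minimal sketch (plain C++; `maxTexels` stands in for the value a real program would query via glGetIntegerv with GL_MAX_TEXTURE_BUFFER_SIZE_EXT):

```cpp
// Returns true if a buffer of `texels` texels fits within the driver's
// texture-buffer limit. `maxTexels` is assumed to have been queried from
// the driver (e.g. GL_MAX_TEXTURE_BUFFER_SIZE_EXT); checking up front
// avoids the silent clamping Fred ran into.
bool fitsInTextureBuffer(long texels, long maxTexels) {
    return texels > 0 && texels <= maxTexels;
}
```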
Hi Fred,
If you can provide a minimal code example I can test it on my machine
and see what's the problem. Otherwise, for me it's impossible to know
what's going on and I can only guess. By the way, did you try using
GL_LUMINANCE instead of GL_RGBA?
Cheers,
Juan
Hi Juan,
Thanks for your reply. I'm still having problems trying to make use of
texelFetchBuffer with your code.
What texture pixel format are you using? I'm using an integer texture format.
Not sure if OSG's allocateImage method behaves ok in this case.
I am creating the texture the following
fred_em wrote:
Code:
[...]
tb->setInternalFormat(GL_RGBA8UI); // 4 bytes per pixel, R-G-B-A format as
per EXT_texture_integer formats specification
osg::Image *image = new osg::Image();
image->allocateImage(128*128, 1, 1, GL_RGBA_INTEGER, GL_UNSIGNED_BYTE, 1); //
note: width=(128*128),
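As a sanity check on the allocation above: GL_RGBA8UI stores four unsigned 8-bit channels per texel, so a width of 128*128 texels should give a 65536-byte backing buffer. A trivial sketch of the arithmetic:

```cpp
// GL_RGBA8UI holds four unsigned 8-bit channels per texel, i.e. 4 bytes.
// For the allocation above (width = 128*128 texels, height = depth = 1)
// the backing buffer is 128*128*4 = 65536 bytes.
long bufferSizeBytes(long texels, long bytesPerTexel) {
    return texels * bytesPerTexel;
}
```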
Hi Fred
What texture pixel format are you using? I'm using an integer texture
format. Not sure if OSG's allocateImage method behaves ok in this
case.
I've only used setup code of this style:
image->setImage(size, 1, 1, GL_LUMINANCE32F_ARB, GL_LUMINANCE,
GL_FLOAT, data,
Hi,
When I compile your code I get the following errors:
Code:
error C2065: 'GL_TEXTURE_BUFFER_EXT' : undeclared identifier
error C2039: 'Extensions' : is not a member of 'osg::BufferObject'
error C2039: 'getExtensions' : is not a member of 'osg::BufferObject'
[...]
I don't see any
Hi,
OK, I temporarily replaced Extensions with GLEW, things moved forward, but I'm
stuck with the following compilation error now:
Code:
} else if (_image.valid() && _image->data()) {
/* Temporary copy */
osg::ref_ptr<osg::Image> image = _image;
/* Creating the texture
Hi,
Sorry for not answering before but I've been away from the computer.
Regarding osg::BufferObject, in my OSG version (2.8.3) there is an
Extensions class declared inside osg::BufferObject. Of course you can
replace it with GLEW, but probably the other compile error is also
related to the
Hi Juan,
I managed to make some progress.
- I replaced osg::BufferObject::Extensions with GLEW
- I changed the other offending expression:
generateTextureObject(contextID, GL_TEXTURE_BUFFER_EXT);
to
generateTextureObject(this, contextID, GL_TEXTURE_BUFFER_EXT);
Things compile fine.
I
I figured out why texelFetch wasn't compiling successfully:
1) I forgot the last argument.
2) I last tried on AMD/ATI hardware, and the GLSL compiler on this platform
doesn't seem to actually recognize 'texelFetch'...
Now moving on with my tests.
Hi Fred,
I have never used TBOs before. How do you use your new class?
Is the following correct?
Code:
// Creation
osg::TextureBuffer *tb = new osg::TextureBuffer();
// This will create a buffer of 300 * 4 bytes = 1200 bytes
osg::Image *image = new osg::Image();
image->allocateImage(300
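Assuming the truncated call mirrors the earlier GL_RGBA8UI example (4 bytes per texel; the actual arguments are cut off), the comment's arithmetic can be checked with a one-liner:

```cpp
// Total bytes for an osg::Image-style allocation of width x height x depth
// texels at bytesPerTexel each. For width = 300, height = depth = 1 and a
// 4-byte format, this gives the 1200 bytes the comment above mentions.
int imageBytes(int width, int height, int depth, int bytesPerTexel) {
    return width * height * depth * bytesPerTexel;
}
```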
Hi Fred,
As far as I know, they are not planned to be part of OSG's API for 3.0.
I wrote a class for dealing with these textures based on the code for
other texture objects. The implementation can be improved to reuse
Texture::TextureObject instead of redeclaring its own
Juan Hernando wrote:
Hi Fred,
As far as I know, they are not planned to be part of OSG's API for 3.0.
I wrote a class for dealing with these textures based on the code for
other texture objects. The implementation can be improved to reuse
Texture::TextureObject instead of redeclaring its
Hi Juan,
Sounds great. Your forum settings are not configured to accept
private messages. I'm interested in your work, if you're willing to
share some code with me drop me an email at fclXYZ.gvsat gmail.com
(replace 'XYZ' with 'aux')
I prefer sending them to everybody so they can be improved