Hi Umesh,

You seem to be overcomplicating things.  The OSG supports the full range
of pixel formats and data types.  You don't need to lift a finger: just
assign the Image you load to the texture and it'll do the right thing by
you.  If you need to use a different format then you can either change the
texture's internal format, or change the type via Image::scaleImage(),
which also allows you to change the format.
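For illustration, a minimal sketch of that default path (the helper
function name and its filename argument are just placeholders):

    #include <osg/Texture2D>
    #include <osgDB/ReadFile>
    #include <string>

    osg::ref_ptr<osg::Texture2D> createTexture(const std::string& filename)
    {
        // The format plugin (e.g. the png plugin) fills in the pixel format
        // and data type on the osg::Image (GL_RGB/GL_RGBA, GL_UNSIGNED_BYTE, ...).
        osg::ref_ptr<osg::Image> image = osgDB::readImageFile(filename);
        if (!image.valid()) return 0;

        osg::ref_ptr<osg::Texture2D> texture = new osg::Texture2D;
        texture->setImage(image.get());   // OSG derives a matching internal format

        // Only if you really need to override the defaults:
        // texture->setInternalFormat(GL_RGBA8);
        // image->scaleImage(image->s(), image->t(), image->r(), GL_UNSIGNED_BYTE);

        return texture;
    }

In the common case the defaults already match the loaded image, so the
commented-out overrides are only needed when you deliberately want
something different.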

My personal advice is to just use the OSG with its defaults.  It's a
mature, well optimized library and it will do exactly the right thing for
you most of the time.  If it turns out there is a bottleneck in your
application then take the next step of optimizing things, BUT never decide
there is a bottleneck before testing to see whether it's real.

Robert.

On 9 December 2014 at 14:11, umesh ramesh <[email protected]> wrote:

> Hi Robert,
>
> Thanks for the prompt response.
> I am not talking about compressed textures. By RGBA8 I mean typically
> R8G8B8A8, and I believe OpenGL interprets it in this way too. Typically,
> in an OpenGL app that has to read a PNG image stored in RGB8 format, we
> would use the LibPNG library to read the data in the correct format, and
> it would give us a pointer to the data (let's say *Data).
>
> Then we would call glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height,
> 0, GL_RGBA, GL_UNSIGNED_BYTE, Data), so OpenGL interprets the data as
> specified by those parameters and renders it.
>
> I peeked into the osg::Texture and osg::Image classes, but could not make
> out whether they handle these data formats properly and pass them on to
> OpenGL in the same format as the image (for example RGBA8). Again, OSG
> will invoke the LibPNG plugin to read the data, and it may need to be told
> that the data is in RGBA8 format. Or I might need to state it explicitly
> through osg::Image, e.g.
> image->setImage(width, height, 1, GL_RGBA8, GL_RGBA, GL_UNSIGNED_BYTE,
> (unsigned char*)data, osg::Image::NO_DELETE).
>
> If OSG does handle 8-bit/32-bit images properly then I believe I should be
> able to save a lot of texture memory.
> But I am not able to figure out whether OSG handles this correctly and
> passes it on to OpenGL. Secondly, if it does, how can I confirm it?
> (Maybe, as I posted earlier, I need to check texture memory usage with an
> OpenGL debugger tool.)
>
> (I am actually working with OpenGL/OSG, as I am a graphics developer.)
>
> Thank you!
>
> Cheers,
> umesh
>
> ------------------
> Read this topic online here:
> http://forum.openscenegraph.org/viewtopic.php?p=62006#62006
>
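For reference, a minimal sketch of the explicit setImage() route described
in the quoted message (the helper name and the width/height/data
parameters are placeholders, and GL_RGBA8 is just one possible internal
format):

    #include <osg/Image>
    #include <osg/Texture2D>

    // 'data' is assumed to point to width*height tightly packed 8-bit RGBA
    // pixels, e.g. as returned by a LibPNG based loader.
    osg::ref_ptr<osg::Texture2D> wrapRGBA8(unsigned char* data, int width, int height)
    {
        osg::ref_ptr<osg::Image> image = new osg::Image;
        image->setImage(width, height, 1,            // s, t, r
                        GL_RGBA8,                    // internal texture format
                        GL_RGBA, GL_UNSIGNED_BYTE,   // pixel format and data type
                        data,
                        osg::Image::NO_DELETE,       // the Image does not own the memory
                        1);                          // packing, like GL_UNPACK_ALIGNMENT of 1
        return new osg::Texture2D(image.get());
    }

The arguments map one-to-one onto the glTexImage2D call quoted above. Once
the texture has actually been applied in a valid context,
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, ...)
reports which internal format the driver really allocated, which is one
way to confirm the texture memory question.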
_______________________________________________
osg-users mailing list
[email protected]
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
