Hi guys - I'm using render to texture (RTT) in one of my apps and I'm
running into alpha blending issues. Here's the setup: a simple RTT
scene with two quads. Each quad has an opaque black background (a
24-bit BMP), and in front of that sits a smaller 32-bit TGA (it
doesn't cover the entire background) with varying transparency.
Here's the issue: on the portion of the quad covered by the 32-bit
TGA I can see right through. OpenGL appears to be doing exactly what
it's supposed to do - blending not only the RGB but also the alpha.
My blend function is (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA), so when
the 32-bit TGA is blended against the opaque black background the
alpha components are blended as well, which lowers the destination
alpha. Hence, that opaque black background ends up semi-transparent
in the RTT target.
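To put numbers on it: blending a source texel with alpha 0.5 over the
opaque background (destination alpha 1.0) runs the alpha channel
through the same blend function, so

    new dst alpha = 0.5 * 0.5 + (1 - 0.5) * 1.0 = 0.75

and a texel that should stay fully opaque in the RTT target ends up
25% transparent, which shows through when the texture is composited
later.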

What I did to (hopefully) solve this is to premultiply the incoming
fragments by their alpha in a shader attached to each node that is
drawn into the RTT, and to use (GL_ONE, GL_ONE_MINUS_SRC_ALPHA) as
the blend function for that node. That seems to solve it - at least
everything now blends the way I'd expect: the opaque parts stay
opaque and the semi-transparent parts blend as expected against the
opaque background.
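In case it helps, here's roughly that setup - a minimal sketch rather
than my exact code ("node" stands for whatever subgraph is drawn into
the RTT camera, and I'm assuming a single texture on unit 0):

    #include <osg/BlendFunc>
    #include <osg/Node>
    #include <osg/Program>
    #include <osg/Shader>
    #include <osg/StateSet>
    #include <osg/Uniform>

    // Fragment shader: multiply RGB by alpha before output, so the
    // RTT target receives premultiplied colour.
    static const char* premultFragSrc =
        "uniform sampler2D tex;\n"
        "void main()\n"
        "{\n"
        "    vec4 c = texture2D(tex, gl_TexCoord[0].st);\n"
        "    gl_FragColor = vec4(c.rgb * c.a, c.a);\n"
        "}\n";

    void setupPremultipliedAlpha(osg::Node* node)
    {
        osg::StateSet* ss = node->getOrCreateStateSet();

        // Attach the premultiplying fragment shader.
        osg::Program* program = new osg::Program;
        program->addShader(new osg::Shader(osg::Shader::FRAGMENT,
                                           premultFragSrc));
        ss->setAttributeAndModes(program, osg::StateAttribute::ON);
        ss->addUniform(new osg::Uniform("tex", 0));

        // Premultiplied-alpha blending instead of the usual
        // (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA).
        ss->setAttributeAndModes(
            new osg::BlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA),
            osg::StateAttribute::ON);
        ss->setMode(GL_BLEND, osg::StateAttribute::ON);
    }

With premultiplied colour the alpha channel also comes out right: a
fully opaque destination stays at alpha 1.0 no matter what the source
alpha is (src_a + (1 - src_a) * 1.0 = 1.0).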

My question is this: what other methods might I employ to deal with
these alpha blending issues? How do you guys recommend handling this?
What's the recommended route to take?

<<attachment: Untitled.jpg>>
