Hi,

I've run into a problem that I suspect is just a gap in my understanding of 
OSG and RTT.  Nevertheless, I'm a bit stumped.

My goal is to do a small RTT test where I take a source texture, render it to a 
quad using an RTT camera and then apply the output to another quad.  In other 
words, I want to use RTT and a shader texture to produce the same result as I 
would get if I simply added a texture to a quad geometry's state set and 
rendered it.

I've modified the osgmultiplerendertargets example, with no luck.  The main 
changes are that I'm using only one output texture, I'm making a texture2D() 
call in the fragment shader, and I've used a Texture2D object for my output 
instead of a TextureRectangle (with normalized coordinates, of course).
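For reference, here is roughly what I mean by the RTT side of the graph.  This 
is only an illustrative sketch (the makeTarget/makeRttCamera names are mine, 
not from the example), but it shows the pieces I believe need to line up: the 
viewport matching the texture size, PRE_RENDER order, and attaching the 
Texture2D as the color buffer:

```cpp
#include <osg/Camera>
#include <osg/Texture2D>

// Create the single color target (Texture2D rather than TextureRectangle,
// so the sampling quad can use normalized texture coordinates).
osg::ref_ptr<osg::Texture2D> makeTarget(int w, int h)
{
    osg::ref_ptr<osg::Texture2D> tex = new osg::Texture2D;
    tex->setTextureSize(w, h);
    tex->setInternalFormat(GL_RGBA);
    tex->setFilter(osg::Texture2D::MIN_FILTER, osg::Texture2D::LINEAR);
    tex->setFilter(osg::Texture2D::MAG_FILTER, osg::Texture2D::LINEAR);
    return tex;
}

// Pre-render camera that draws `subgraph` (the first quad, with the source
// texture and shader applied) into the target texture via an FBO.
osg::ref_ptr<osg::Camera> makeRttCamera(osg::Texture2D* target,
                                        osg::Node* subgraph,
                                        int w, int h)
{
    osg::ref_ptr<osg::Camera> cam = new osg::Camera;
    cam->setClearColor(osg::Vec4(0.0f, 0.0f, 0.0f, 1.0f));
    cam->setClearMask(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // Render before the main camera, into an FBO, with its own viewport
    // sized to the texture -- a mismatch here is a classic RTT bug.
    cam->setRenderOrder(osg::Camera::PRE_RENDER);
    cam->setRenderTargetImplementation(osg::Camera::FRAME_BUFFER_OBJECT);
    cam->setViewport(0, 0, w, h);

    // Decouple from the main camera's view: absolute reference frame and
    // an ortho projection covering a unit quad at z = 0.
    cam->setReferenceFrame(osg::Transform::ABSOLUTE_RF);
    cam->setProjectionMatrixAsOrtho2D(0.0, 1.0, 0.0, 1.0);
    cam->setViewMatrix(osg::Matrix::identity());

    // Attach the Texture2D as this camera's color buffer.
    cam->attach(osg::Camera::COLOR_BUFFER, target);

    cam->addChild(subgraph);
    return cam;
}
```

The output of makeTarget() would then go into the state set of the second 
quad (e.g. one built with osg::createTexturedQuadGeometry) in the normal 
scene, exactly as if it were an ordinary file-loaded texture.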

I know my texture shader works fine, and it's pretty obvious the error is in 
the RTT side of the graph.

Any thoughts or pointers?

Thanks,

Joel

------------------
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=47840#47840

_______________________________________________
osg-users mailing list
[email protected]
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
