I currently have an application that uses several osg::Camera objects, each of which renders to its own osg::Image.
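For reference, the per-camera setup is roughly the following (the helper name, image format, and sizes are just illustrative):

#include <osg/Camera>
#include <osg/GL>
#include <osg/Image>
#include <osg/ref_ptr>

// One camera rendering into its own osg::Image via an FBO.
osg::ref_ptr<osg::Camera> makeImageCamera(int width, int height)
{
    osg::ref_ptr<osg::Image> image = new osg::Image;
    image->allocateImage(width, height, 1, GL_RGBA, GL_UNSIGNED_BYTE);

    osg::ref_ptr<osg::Camera> camera = new osg::Camera;
    camera->setRenderTargetImplementation(osg::Camera::FRAME_BUFFER_OBJECT);
    camera->setRenderOrder(osg::Camera::PRE_RENDER);
    camera->setViewport(0, 0, width, height);
    // Copy the camera's colour buffer into the osg::Image after each render.
    camera->attach(osg::Camera::COLOR_BUFFER, image.get());
    return camera;
}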

Is there a way to get these cameras to render to the same texture, but with each camera using distinct real estate within it? i.e., each camera's frame would appear side by side within the texture.

I can render them to the screen by setting their viewports to be adjacent to one another, but I can't figure out how to save them to a texture in this way without moving data around in the texture buffer.
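The on-screen layout is essentially this (tile sizes are illustrative):

#include <osg/Camera>

// Two cameras laid out side by side within the same window.
void layoutSideBySide(osg::Camera* left, osg::Camera* right,
                      int tileWidth, int tileHeight)
{
    left->setViewport(0, 0, tileWidth, tileHeight);
    right->setViewport(tileWidth, 0, tileWidth, tileHeight);
}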

I can't use readPixels or some derivative thereof, because I plan to render the cameras to a pixel buffer so that they don't have to be rendered to the screen.
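The pbuffer setup I have in mind is along these lines (untested sketch; the traits values are illustrative):

#include <osg/Camera>
#include <osg/GraphicsContext>
#include <osg/ref_ptr>

// Off-screen pbuffer context for the cameras to render into.
osg::ref_ptr<osg::GraphicsContext> makePbufferContext(int width, int height)
{
    osg::ref_ptr<osg::GraphicsContext::Traits> traits =
        new osg::GraphicsContext::Traits;
    traits->x = 0;
    traits->y = 0;
    traits->width = width;
    traits->height = height;
    traits->pbuffer = true;       // request an off-screen pbuffer
    traits->doubleBuffer = false;

    return osg::GraphicsContext::createGraphicsContext(traits.get());
}

Each camera would then get this context via camera->setGraphicsContext(...) instead of a window.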

In case it's necessary: I'm trying to render a full 360° view around a single point with minimal distortion. I've found the best way to do this is to use multiple cameras and line up their frustums so they share a side.
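The rig is roughly the following (the eye point, clip planes, and z-up convention are illustrative): four cameras around one point, each with a 90-degree frustum, so adjacent frustums share a side.

#include <cmath>
#include <vector>
#include <osg/Camera>
#include <osg/Math>
#include <osg/Vec3d>
#include <osg/ref_ptr>

std::vector< osg::ref_ptr<osg::Camera> > make360Rig(const osg::Vec3d& eye)
{
    std::vector< osg::ref_ptr<osg::Camera> > cameras;
    for (int i = 0; i < 4; ++i)
    {
        osg::ref_ptr<osg::Camera> camera = new osg::Camera;
        // 90-degree FOV at aspect 1.0, so four cameras cover the full
        // 360 degrees around the vertical axis.
        camera->setProjectionMatrixAsPerspective(90.0, 1.0, 0.1, 1000.0);

        double yaw = osg::DegreesToRadians(90.0 * i);
        osg::Vec3d dir(std::cos(yaw), std::sin(yaw), 0.0);
        camera->setViewMatrixAsLookAt(eye, eye + dir, osg::Vec3d(0.0, 0.0, 1.0));

        cameras.push_back(camera);
    }
    return cameras;
}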

Thanks in advance!
