Hi,

Have a look at the osgdistortion example or the
osgViewer::setUp3DSphericalDisplay() code - it sets up six slave
cameras that render into a cube map texture, and then a final
slave camera that renders the distortion-correction mesh.

Robert.

On Mon, Mar 10, 2008 at 9:10 PM, spowers <[EMAIL PROTECTED]> wrote:
> I currently have an application that uses several osg::Camera objects
>  that each render to an osg::Image.
>
>  Is there a way to get these cameras to render to the same texture but
>  use distinct real estate for each camera?
>  i.e. each frame would appear side by side within the texture.
>
>  I can render them to the screen by setting their viewports to be
>  adjacent to one another, but I can't figure out how to save them to a
>  texture in this way without moving data around in the texture buffer.
>
>  I can't use readPixels or some derivative thereof because I plan to
>  render the cameras to a pixel buffer so that they don't have to be
>  rendered to the screen.
>
>  In case it's necessary: I'm trying to render a full 360-degree view around
>  a single point with minimal distortion. I've found the best way to do this
>  is to use multiple cameras and line up their frustums so they share a side.
>
>  Thanks in advance!
>  _______________________________________________
>  osg-users mailing list
>  osg-users@lists.openscenegraph.org
>  http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
>
